Impact of Second-Gen 3DTV on STB

Pay TV operators are the ones who will have to prove the viability of 3DTV in the home and, while they are creating a market for the new services, Frame Compatible 3D technology means they can use their existing HDTV infrastructure (particularly encoders and set-top boxes) and avoid significant investments.

However, the wider broadcast industry is openly discussing whether there is a need for a second-generation 3DTV transmission technology and what form it should take, with some broadcasters keen on Service Compatible 3D (where the 2D and 3D services are transmitted in a single signal, so that 2D set-tops display the content in 2D while new 3D STBs output the 3DTV version).

We asked a leading video SoC (System on Chip) vendor, Sigma Designs, about the implications of ‘second generation’ 3DTV transmission technologies for the set-top box. Vincent Harradine, Director of Systems Engineering at the company, provides the answers.

What is the impact on customer premises equipment of frame compatible 3DTV services?
One of the main reasons for the existence of such frame compatible formats is their ability to be supported by existing/legacy infrastructure and equipment such as set-top boxes. An existing set-top box can support the decode and output of 3DTV frame compatible formats, e.g. side-by-side or top-and-bottom. Frame compatible formats do not require any additional processing or formatting beyond decoding.

The dependence is on the display device, i.e. the 3D TV, to take care of formatting, typically to a frame sequential or page flip format for display. Additionally, the latest HDMI 1.4a specification makes such frame compatible formats mandatory, and all 3D TVs currently available are required to receive and display them.

All of this can be achieved without any firmware update to the STB but does rely on the customer having a 3D ready TV.
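As a rough illustration of why frame compatible services slot straight into a legacy HD chain, the sketch below shows how a side-by-side picture is typically formed upstream and then unpacked by the display: each eye's 1920x1080 view is squeezed to half horizontal resolution and the two halves share one ordinary HD frame. The function names and the simple pixel-pair averaging are illustrative assumptions, not any particular encoder's or TV's implementation.

```python
import numpy as np

def pack_side_by_side(left, right):
    """Pack full-HD left/right views into one frame compatible side-by-side frame.

    left, right: (1080, 1920, 3) uint8 arrays. Each view is squeezed to half
    horizontal resolution (here by averaging pixel pairs, an illustrative
    choice) and placed in one half of a normal 1920x1080 HD frame, which a
    legacy HD set-top box can decode and output unchanged.
    """
    def squeeze_h(view):
        # Average each horizontal pixel pair: 1920 -> 960 columns.
        return ((view[:, 0::2].astype(np.uint16) + view[:, 1::2]) // 2).astype(np.uint8)

    frame = np.empty((1080, 1920, 3), dtype=np.uint8)
    frame[:, :960] = squeeze_h(left)
    frame[:, 960:] = squeeze_h(right)
    return frame

def unpack_side_by_side(frame):
    """What the 3D TV does on reception: split the halves and stretch each
    back to full width (nearest-neighbour here for brevity) before presenting
    them frame-sequentially or however the panel requires."""
    left = np.repeat(frame[:, :960], 2, axis=1)
    right = np.repeat(frame[:, 960:], 2, axis=1)
    return left, right
```

A top-and-bottom packing works the same way with a vertical squeeze, which is why only the display, and not the set-top box, needs to understand the 3D structure.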

Can existing HD STBs support full 1080p50/60 HD if platform operators want to improve the quality of 3D television by increasing the quality of the HD used in frame compatible mode?
Two main issues prevent 1080p50/60 support from being readily available in today’s STBs. First, current video decoder SoC devices lack support for decoding 1080p50/60. Second, the HDMI transmitter physical layer is restricted to a maximum pixel clock rate of 74.25MHz, which is insufficient to support the 148.5MHz pixel clock required by 1080p50/60.
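The 148.5MHz figure follows directly from standard HD raster timings, as the back-of-the-envelope check below shows; the per-line and per-frame totals (active plus blanking) are the usual values for these formats, and the snippet is only a sanity check, not vendor data.

```python
# Pixel clock = total samples per line x total lines x frame rate,
# counting horizontal and vertical blanking as well as active video.
formats = {
    "1080p60": (2200, 1125, 60),
    "1080p50": (2640, 1125, 50),
    "1080i (30 frames/s) or 1080p30": (2200, 1125, 30),
}

for name, (samples_per_line, total_lines, frames_per_s) in formats.items():
    clock_mhz = samples_per_line * total_lines * frames_per_s / 1e6
    print(f"{name}: {clock_mhz:.2f} MHz")

# 1080p50/60 both land on 148.5 MHz, exactly double the 74.25 MHz that
# legacy HDMI transmitter physical layers were designed around.
```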

What are the key challenges in enabling set-top boxes to support full 3DTV? What is the likely premium for such set-tops? When are we likely to see them?
Support for full 3DTV will require new STB hardware supporting the appropriate 3D format (e.g. base layer plus enhancement, or 2D plus metadata) along with HDMI transmitter support for the necessary increase in data rate. Initially there could be a market for such a premium or high-end full resolution STB. However, over time it will likely become mainstream, especially as consumers become aware of the quality difference between BD [Blu-ray Disc] (MVC full 3DTV) and frame compatible (e.g. side-by-side) versions of the same content.

What challenges would 2D + Delta provide in the STB if operators used this to enable one video stream that could be used for 2D television on 2D STBs and 3DTV on 3D capable STBs?
Typically, today’s SoC decoders/media processors are incapable of decoding 2D + Delta. This means STB upgrades would be necessary; in the absolute best case a firmware upgrade may suffice, but in most cases new STB hardware would be required. 3D signalling would also be necessary, allowing for the appropriate handling of content in 2D versus 3D environments.

Do you think using a single signal for 2D and 3D is realistic - or are 2D and 3D television too different creatively (e.g. camera positions, the way content is shot) for this to work?
For the vast majority of content this will work. Issues could potentially arise over content shot for the large screen (movies) versus the small screen (TV). But then this is the basis for the selection of MVC as the coding format for Blu-ray (BD), where, in theory, 3D BDs should be capable of playback on existing 2D systems by displaying the single full frame image.

What challenges would 'frame compatible plus enhancement' have for the STB (where frame compatible 3D can be viewed on a normal HD set-top box, as today, but an enhancement layer can also be used to create full 3DTV from the same signal in homes where new, full 3DTV STBs are deployed)?
Today’s STBs are not capable of supporting frame compatible plus enhancement without at least a firmware upgrade. Even a firmware upgrade is a very remote possibility. Almost certainly, new STB hardware will be necessary to allow for such support. Once again, appropriate 3D signalling must be standardized and realized, allowing for hands-free handling of such content.

How realistic is Multiview Video Coding (MVC), as seen in 3D Blu-ray players, for set-top boxes? What cost premium is there compared to standard HD video decoding? When could MVC be supported in STBs?
Sigma Designs will have silicon available in the fall that supports both frame compatible and full 3DTV formats, such as MVC, so it is likely such STBs could become available during 2011. Because two full left and right images must be decoded and then managed, the need for memory bandwidth increases accordingly. This has the potential to increase the BOM [Bill of Materials] cost of STBs. However, the Sigma Designs solution includes intelligent data handling that manages the increased memory bandwidth more efficiently.
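To put the memory bandwidth point in perspective, the rough figures below compare the raw decoded-picture traffic for one HD view against two full views. The assumptions (4:2:0 8-bit video, counting only the writes of decoded frames) are simplifications for illustration; real decoder traffic, with reference-frame reads and display scanning, is a multiple of these numbers.

```python
def decoded_write_rate_mb_s(width, height, fps, views=1, bytes_per_pixel=1.5):
    """Rough write bandwidth just for storing decoded 4:2:0 8-bit pictures."""
    return width * height * bytes_per_pixel * fps * views / 1e6

single = decoded_write_rate_mb_s(1920, 1080, 60)           # one HD view
stereo = decoded_write_rate_mb_s(1920, 1080, 60, views=2)  # full left + right
print(f"one view:  {single:.0f} MB/s")   # ~187 MB/s
print(f"two views: {stereo:.0f} MB/s")   # ~373 MB/s
```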

By John Moulding, Videonet

Sony Builds 2D-3D Conversion Tool

Sony is to build realtime 2D to 3D conversion capability into its 3D Processor box, while a potentially serious power shortage involving the 3D setup nearly derailed World Cup production.

Sony is taking stock of the technical lessons it learned during the intense 3D match day production schedule during the recent World Cup tournament. Future iterations of its MPE-200 3D processor are likely to include 2D to 3D conversion, finessed colour correction, QC for 3D on ingest into edit suites or before transmission and improved configuration tools between the lenses, rigs and processor box.

JVC’s IF-2D3D1 video image processor was used to convert 2D images shot from helicopter, Spidercam and some pitchside steadicams for the production.

“One of the things we are sensitive to is that when companies buy into the hardware power of the MPE-200 processor they need to look at return on investment around 3D,” explains Mark Grinyer, Head of Sports Business, Sony Professional. “We’re looking at what we can do around the hardware platform (based on the Cell engine which drives the PS3) and 3D conversion is one of those ideas. For outside broadcasters the flexibility of a production tool is also key. We want to automate as many 3D processes as possible with this platform.”

By all accounts the system itself worked almost without a hitch during the 25-match 3D run. According to Duncan Humphreys, Creative Director of CAN Communicate and technical consultant to production team HBS: “It was a very smooth operation. I always believed that eventually 3D correction would be done digitally rather than mechanically, and what Sony has achieved in such a short period of time is unbelievable.

“We had virtually no problem with the 3D boxes. In fact it’s a game-changer – producing quality live 3D with standard broadcast lenses and cameras. The one thing that will kill off 3D is expensive 3D, because broadcasters are not going to pay the kind of premiums that are being required at the very top end.”

Humphreys revealed that the production team had to solve one major technical glitch during tests in the run up to South Africa. The fault, which took a month to identify and fix, turned out to be an issue with the Sony HDC-1500’s power supply.

“The cameras were working just fine until we went into live trials, when the system was zooming strangely and at one point failed altogether,” says Humphreys. “We eventually realised that, since we were powering the Element Technica rigs, including all zooming and lens-matching alteration, plus the communications traffic, there wasn’t enough power in the cameras to do everything we needed.”

The problem was resolved by attaching an external power supply to the rigs although Sony is addressing the issue. “In stadia or other venues with power sources there’s no problem but in certain locations it could be, so we’ll look into it,” says Grinyer.

The next version of the Sony 3D boxes’ software will be released shortly after IBC. At the Amsterdam event Sony will also reveal the next stage of its prototype 3D single bodied dual lens camcorder which is not thought to be radically different in design from that released earlier this year by Panasonic.

By Adrian Pennington, TVB Europe

3D or Not 3D: The Road Ahead for TV

Broadcast transmission specialist, Broadcast Australia, has released a white paper which explores the challenge of establishing Australia’s new 3D TV environment, and highlights the importance of laying the foundations to ensure the country’s free-to-air broadcasting infrastructure is ready to deliver new and unique digital content.

The paper, entitled 3D or Not 3D: The Road Ahead for TV, provides insight into the current 3D TV environment, exploring the consumer proposition and drivers for deployment, as well as considering the various technology options, how they work, and what issues need to be addressed in order to make 3D TV successful.

To date, Broadcast Australia has played a central role in the delivery of terrestrial 3D TV, and the paper puts forward a case for the development of appropriate policies—with regard to applicable standards, licensing frameworks and spectrum—to provide an evolution path for the future deployment of terrestrial 3D TV services.

Panasonic Introduces World's First 3D Consumer Camcorder

Panasonic unveiled the world's first 3D camcorder for consumers. The HDC-SDT750 camcorder allows anyone to create powerful, true-to-life 3D images by simply attaching a 3D conversion lens that comes with the camcorder. The camcorder will go on sale in Japan on August 20, to be followed in other countries in autumn of this year.

Even without the 3D conversion lens attached, there are many ways to enjoy the SDT750 camcorder. The 3MOS system with improved noise reduction (NR) technologies records dimly lit scenes more beautifully than ever before. Other sophisticated functions include 1080/60p for NTSC or 1080/50p for PAL recording (Full-HD 1,920 x 1,080, 60 or 50 progressive recording) for ultra-smooth images, iA (Intelligent Auto) mode with the new HYBRID O.I.S. (Optical Image Stabilizer), and a wealth of manual functions controlled by a manual ring for easy, creative shooting.


Panasonic HDC-SDT750

The high-sensitivity 3MOS System provides an effective motion image pixel count of 7,590,000 pixels (2,530,000 pixels x 3). And even with this high pixel count, newly developed technology increases sensitivity, and further-evolved NR technology achieves bright images with minimal noise when shooting under low light conditions. This maximizes the 3MOS features of excellent color reproduction, high resolution and rich gradation, and lets the user capture vividly colored images in both bright and darkly lit places.

Recording in 1,080/60p for NTSC or 1,080/50p for PAL, the SDT750 camcorder produces richly expressive images, with none of the detail loss and flickering of conventional 1,080i (interlaced) recording. In addition, the iA function, which was highly popular on previous models, makes it easy for anyone to take beautiful videos. The SDT750 camcorder also newly incorporates the HYBRID O.I.S. system to bring clear, beautiful HD image quality to telephoto shots as well.

Source: Panasonic

3D: How Video Compression Technology can Contribute

This white paper presents the quality tradeoffs of the most popular 3D packing schemes from a video compression perspective with particular attention paid to the specific AVC/H.264 artifacts that they might cause.

By Pierre Larbier, ATEME

Biz Brains Ponder 3D Experience

Directors with 3D experience, including James Cameron and Eric Brevig, have said for years they believe that stereoscopic 3D content affects the brain differently than 2D content.
It turns out, there's significant scholarly research to back up that idea, some of it coming from neuroscientists within the biz.

Two bizzers with a neuroscience background are now working for the Legend3D shingle, which does 2D to 3D conversion: founder/prexy Barry Sandrew and new hire Toni Pace Carstensen. Both say research on 3D and the emerging field of "neurocinematics" show 3D affects viewers differently than traditional cinema.

Because binocular vision is natural, explained Sandrew, the brain expects to see a different view with each eye of objects nearby. When it doesn't, it gives what it's looking at less importance.

On the other hand, when the brain sees something with that "binocular disparity," it reacts very differently, especially if there's something flying off the screen.

Sandrew notes viewers don't react much in a 2D film if something flies toward camera, but viewers duck when the same thing happens in 3D. The difference arises because the stereo view activates a very fast pathway in the brain that stimulates the amygdala, a primary center for emotions, and triggers the fight-or-flight response.

"(This pathway) can be activated in 2D but not nearly as strongly as in 3D," he said. "We are hard-wired to respond to stereo images."

This jibes with what "Journey to the Center of the Earth" producer Charlotte Huggins has long argued: that audiences don't "watch" 3D movies, they "experience" them.

Sandrew said the 3D image is more "significant" to the brain. "The immersive quality stimulates the sense of self in each person in the audience," said Sandrew. "Each person is experiencing a very personal experience, where in a 2D movie it's more of a group dynamic."

Sandrew holds a Ph.D. in neuroscience from SUNY Stony Brook and spent seven years doing brain research at Harvard. He was lured away to work on colorization by entrepreneurs "who offered me a package I couldn't refuse" and has since moved on to 3D conversion.

Carstensen has an experimental psychology degree and was headed for a doctorate from the U. of Virginia before realizing "I had this goal of making the world a better place and the path I have chosen was not taking me there." But she's never stopped studying the field, even while working as a vfx producer on such pics as "Avatar."

Sandrew and Carstensen both warn of another effect of 3D: viewers look around the frame more and look away from the actors more quickly. "Depth is a major distraction if you're trying to draw the audience's attention," said Sandrew.

On "Avatar," said Carstensen, Cameron had spent much time and effort creating the world of Pandora, so it helped the movie when auds' eyes wandered. But Sandrew recently showed a 3D test to a major director of a summer tentpole whose movie ended up not getting a 3D release. When he got to a two-shot, the helmer said "The depth takes away the intent of my direction."

It seems 3D shots need to be framed differently, with fewer objects in frame, to keep the aud's attention on actors. At the same time, though, Sandrew and Carstensen agree that actors can actually have more impact in 3D.

"Disparity is something the brain is expecting," says Sandrew. "It's closer to reality, so it has a deeper meaning, a more significant meaning to the observer."

By David S. Cohen, Variety

The Frame Compatible 3D World Cup

While the French football team left the South Africa FIFA World Cup under a cloud, France Telecom subsidiary GlobeCast emerged with its head held high after a successful tournament in which it provided contribution for 28 live matches in 3D. The games were transmitted around the world by satellite, displayed to cinema audiences and also received by major broadcasters including ESPN in the USA, which launched its 3D channel, ESPN 3D, in time for the tournament.

Contribution from the stadia to the International Broadcast Centre (IBC) was performed as separate left eye and right eye streams using JPEG2000 compression. At the IBC, Sensio equipment converted two HD SDI signals into one HD SDI side-by-side (Frame Compatible) output and this was encoded with an Ericsson E5780 HD encoder in MPEG-2 at 40Mbps in 27MHz of bandwidth using DVB-S2 modulation, uplinked via Intelsat to London, with fibre back-up into London and Frankfurt. From London the signals were redistributed worldwide.
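As a rough sanity check on that link (the roll-off factor and the candidate modulation below are assumptions for illustration, not GlobeCast's published parameters), a 27MHz slot can indeed carry a 40Mbps stream under DVB-S2:

```python
bandwidth_hz = 27e6
rolloff = 0.20                                # one of the DVB-S2 roll-off options (assumed)
symbol_rate = bandwidth_hz / (1 + rolloff)    # ~22.5 Msym/s

payload_bps = 40e6                            # the MPEG-2 rate quoted above
bits_per_symbol_needed = payload_bps / symbol_rate

print(f"symbol rate ~ {symbol_rate / 1e6:.1f} Msym/s")
print(f"needs ~ {bits_per_symbol_needed:.2f} useful bits per symbol")
# ~1.8 bits/symbol is beyond all but the very highest QPSK code rates,
# so an 8PSK mode with a mid-range FEC would carry the service with margin.
```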

Because some cinema projectors could only work with 720p HD (Frame Compatible 3DTV), GlobeCast also decoded the MPEG-2 stream in London to convert the content into this HD format, taking care of some specific audio configurations as well. For the rest of the market the signal was delivered as 1080i50 HD. Broadcast/Pay TV customers for the GlobeCast 3D contribution feeds included ESPN in the US, Sky PerfecTV in Japan, CCTV in China, SBS in Australia, TF1 in France and Sogecable in Spain.

MPEG-2 was used for the contribution out of South Africa to cater for the needs of cinema networks who only had MPEG-2 receiver/decoders. And while the feed from the World Cup was in Frame Compatible 3DTV, Simon Farnsworth, Global Head of Contribution at GlobeCast, expects the broadcast contribution market to evolve so that it uses simultaneous left eye and right eye contribution signals, synchronized for live content, with both compressed in MPEG-4 AVC and transported as a two channel multiplex.

This will mean doubling the bandwidth needed for contribution compared to HDTV, although the use of MPEG-4 AVC will provide bandwidth efficiencies compared to MPEG-2 contribution.

GlobeCast has already demonstrated 3DTV contribution using two full HDTV streams. This approach was used in May for the live final of the French ‘Idols’ style reality TV show Nouvelle Star. Two hundred and fifty VIPs invited by broadcaster M6 to a cinema in Paris, as well as the subscribers to Orange’s experimental 3D channel with access to a 3D television, viewed the final output. The contribution used MPEG-4 AVC 4:2:2 10-bit compression in a solution developed in partnership with ATEME.

GlobeCast provides content management and worldwide transmission services for professional broadcast delivery. At IBC during September the company will be discussing its success in South Africa and other 3D projects. Over the past year, the company has also delivered the world’s first live 3D fashion show from Burberry’s catwalk at London Fashion Week to parties in Paris, New York, Dubai and Tokyo, as well as the final of the French Tennis Open.

3DTV will almost certainly be a feature of Ericsson’s IBC this year, as well. At NAB in April the company was discussing its solution for direct-to-home and contribution and distribution of 3D content. And the company is already providing ESPN 3D with a complete standards based video processing solution, featuring encoders and receivers tuned for ESPN 3D broadcasts, as well as for HD.

At NAB the company highlighted the Ericsson CExH42 MPEG-4 AVC HD Contribution Encoder, described as a natural platform for 3D contribution links, “ensuring full control of encoding parameters, exact synchronization and time-stamping of the compressed frames and the generation of a fully packaged 3D simulcast”.

By John Moulding, Videonet

Kerner Shifts Focus to 3D

A year after changing hands, Lucasfilm spinoff Kerner Technologies has reorganized to focus on 3D technology and production. Among its new initiatives: a production fund for financing 3D pictures.

As part of the reorg, Kerner has set aside or dropped some of its early efforts, including its FrameFree software, an "autostereo" monitor (3D without glasses) and plans for a 3D network via satellite TV.

Now to be known as the Kerner Group, the company owned by entrepreneur Eric Edmeades will have multiple divisions.

  • KernerFX is the new monicker for what had been Kerner Optical, the former models and miniatures shop of Industrial Light & Magic. "We'll stay focused on our specialty of destruction," Kerner Group prexy Tim Partridge said. Brian Gernand will oversee KernerFX with the title of senior creative director. Geoff Heron is practical effects supervisor. Company is adding limited digital vfx services to its offerings, mainly to enhance and work in tandem with its practical f/x work.

  • Kerner 3D Technologies refocuses the company's advanced electronics and rig-building efforts on stereoscopic 3D. The outfit is working on new Kernercam rigs for 3D capture and promises more developments in that area. Engineer Greg Beaumonte, also a longtime ILM vet, is co-designer of the Kernercam 3D system. Deployment of early Kernercam 3D systems remains limited, though the rig was used by David Arquette on his short The Butler's In Love, which will screen at the Hollyshorts fest at the DGA HQ in Los Angeles Aug. 5. Kerner will be showing its rig at the DGA's "Digital Day" on July 31.

  • Kerner Studios makes the soundstages and production gear on Kerner's San Rafael campus available for rent for shooting, particularly for 3D productions.

Other divisions: Commercial Production; Corporate and Government Research; Model and Miniature Design for nonentertainment clients; and a group dedicated to production of original 3D film and television projects.

"We currently have three projects that we're either producing or co-producing," Edmeades said.

The Kerner Group is in conversations with interested investors, Edmeades said, noting that investors may have the opportunity to put money into the business directly or into a 3D film fund. Kerner has shelved its work on FrameFree, a potentially revolutionary method for recording and distributing video, pending a re-org at the Japanese company that holds some underlying patents, said Edmeades.

Kerner Optical began as part of Industrial Light & Magic, but as ILM's digital business outgrew its practical f/x shop, George Lucas sold the original ILM practical effects operation in a management buyout. That became Kerner. Kerner Technologies sprang up as the company branched out into advanced electronics.

Edmeades acquired a majority stake in the company and became CEO in 2009. Among the other key personnel for the Kerner Group are Partridge, who was hired by Edmeades after helping develop Dolby's 3D exhibition system, and executive producer Rose Duignan, a longtime ILM marketing vet and TV producer.

By David S. Cohen, Variety

3D Stereoscopic Editing in Premiere Pro CS5

Join Dave Helmly as he walks you through a complete workflow for 3D stereoscopic editing in Premiere Pro CS5. In this 60-minute tour, Dave covers various rigs, 3D viewing options, realtime editing, and export.



Source: Adobe

IMAX to Form Strategic Partnership with Laser Light Engines

IMAX Corporation announced that it has signed a memorandum of understanding with Laser Light Engines, a leading developer and manufacturer of ultra-high brightness, laser-driven light sources. Under the terms of the agreement, IMAX plans to make an equity investment in Laser Light Engines, and Laser Light Engines would develop a custom version of its laser light technology for exclusive use in IMAX digital projection systems. Laser Light Engines would additionally provide outsourced research and development for new features designed to further enhance and distinguish The IMAX Experience for IMAX theatre operators, film studios and moviegoers.

As part of the parties' contemplated partnership, Laser Light Engines would offer its technology exclusively to IMAX for a period of two years and would not offer its technology to any other large format theatre systems for a period of three years. The memorandum of understanding signed by IMAX and Laser Light Engines gives the parties an exclusive period of time within which to reach final agreement on these and other terms.

Source: IMAX Corporation

DVB Publishes 3D Requirements

The Digital Video Broadcasting Project (DVB) steering board has approved commercial requirements for 3D-TV, with the group opting for the Frame Compatible Plano-Stereoscopic System. Plano-stereoscopic imaging systems deliver two images (L, R) that are arranged to be seen simultaneously, or near simultaneously, by the left and right eyes. Special glasses are usually needed by the viewer.

“The main implication for the requirement is that the L and R images must be arranged in a ‘spatial multiplex’ such that the resulting signal can be processed by the STB substantially as a conventional HDTV signal,” says the document. “Following ITU and other terminology, this is termed a Frame Compatible (FC) format.”

The document defines DVB 3D-TV commercial requirements for DVB members from the key industry groups utilising existing high-definition TV infrastructure. It also deals with issues such as subtitling of 3D content and graphics and text display.

While the group has gone for the Plano-Stereoscopic System, there could be room for another set of standards within the DVB Project. The document says: “Other DVB members have expressed the potential need for a set of standards that are appropriate to a different set of commercial requirements. These commercial requirements are in the process of being discussed and agreed, and they are not considered in this document.”

By Rose Major, Rapid TV News

Yankees-Mariners to Serve as Testing Ground for 3D Transmission

Like most live 3D telecasts thus far, this weekend’s 3D production of two New York Yankees-Seattle Mariners baseball games will be an exercise in experimentation. However, unlike previous sporting events broadcast in 3D, the YES Network-FSN Northwest production will be transmitted out to a total of eight distribution partners, the most of any 3D sports telecast yet.

On Wednesday, Blue Ridge Communications, Cablevision, Comcast, Cox, Service Electric Broadband Cable, and Time Warner Cable joined the previously announced DirecTV and Verizon FiOS as 3D carriers for the July 10 and 11 games. DirecTV will run fiber directly from Safeco Field in Seattle for transmission and use the satellite uplink as a backup; the other carriage partners will receive the feed via satellite.

“This is obviously very historic,” says John McKenna, chief engineer, YES Network. “It’s the first 3D baseball game that anybody has done, and we’re pretty proud to be involved in doing it. It’s definitely a milestone in a lot of careers for the people involved.”

The 3D capture end of the production alone will be a massive undertaking, but the transmission side will be just as challenging. Two discrete left-eye/right-eye 720p 16×9 feeds will run into NEP’s SS31 3D production truck. Both will be fed into a Miranda Imagestore processor, which will insert YES 3D branding graphics and provide a full-screen YES 3D graphic to be displayed instead of black screen before the game and during commercial breaks.

From there, the signals will be fed into a SENSIO 3D encoder provided by PACE. The SENSIO encoder takes the 720p left- and right-eye signals and merges them into a 720p side-by-side picture. The newly merged signal is then sent through a Harris NetVX series encoder and on to the uplink truck. YES and FSN Northwest are using the AMC1 satellite and 18 Mbps bandwidth to distribute the signal to their affiliates. The feed will be transmitted directly from Seattle; YES headquarters in Stamford, CT, will not play a role in the transmission process.

“We don’t have 3D parallel or 3G capability here in Stamford,” says McKenna. “We had not expected at this point to need that kind of bandwidth yet. So we figure, it’s easier to go right from the site because there’s nothing we can do with it [in Stamford]. It’s all coming right from the site to the affiliates.”

The production team will run a “3D dress rehearsal” during tonight’s Yankees-Mariners game to gain some live-game insight and give distribution partners a chance to optimize their 3D capabilities.

“The [Friday-]night game will be a dress rehearsal for 3D, and we’ll put it up on the satellite for the affiliates to have a six-hour window to tune and tweak and make sure everything is working,” says McKenna. “Anything we glean from that, we’ll be able to apply to the next game or anything after that.”

The real deal will start on Saturday night, when the Yankees and Mariners take the field at 10 p.m. ET. Sunday’s game will begin at 4 p.m. ET. DirecTV and Panasonic will be presenting sponsors of the two 3D telecasts.

“It’s a very steep learning curve for everybody,” McKenna acknowledges. “We know that, when 3D is good, it’s great, but, when it’s bad, it’s absolutely horrible. We have to start off on the right foot.”

By Jason Dachman, Sports Video Group

Yankees, Mariners Step into 3D Spotlight

This will be another historic weekend in sports broadcasting: the YES Network, FSN Northwest, and DirecTV will broadcast two games between the New York Yankees and Seattle Mariners in 3D. For hardcore baseball fans, it signals a new day in enjoying the game at home, and, for hardcore sports-production professionals, it signals a new challenge: is baseball 3D-friendly?

“It’s a learning experience,” says Ed Delaney, VP of operations for YES Network. “We want to put the best broadcast we can on the air, and there is a lot to learn.”

NEP’s recently refurbished SS31 production unit will be on hand for the broadcast. Similar to its original incarnation, SS31 is a single-expando with a front-to-back control room and a Calrec Q2 audio console. The truck also features a Sony MVS-8000A production switcher, EVS XT[2] servers, support for 10 tape machines, and the ability to support 14 PACE 3D camera rigs in a variety of configurations. Chyron graphics will be used for the games, and a B unit will house the stereographer and convergence operator, with 3D expert/PACE CEO Vince Pace and his team on hand to oversee the production.

Tonight’s game will be produced in 3D as a full rehearsal to enable the team of nearly 40 personnel to iron out some of the kinks. Delaney says the technical side of the operation is in great shape, with six 3D camera positions to be complemented by 2D cameras whose signals pass through HDlogix 2D-to-3D converters.

“We did a test with HDlogix during spring training and had some success with it,” says Delaney.

Five hard cameras will be in place: one each at low home, low first, low third, centerfield, and high home. A sixth camera, a handheld 3D beam-splitter rig, will also be used for game coverage. And shots in the announce booth will be captured with the Panasonic AG-3DA1 3D camcorder, a unit that is quickly earning the nickname “Wall-E,” given its uncanny resemblance to the animated robot of Pixar fame.

Delaney expects the shots from low home to deliver the best 3D impact. The high-home camera position, however, remains a concern because it will be shooting through the screen behind home plate. During tonight’s rehearsal, the quality of those images will be evaluated, and, if necessary, the camera will be moved to an alternative position that is not behind the net.

“We will learn a lot during the rehearsal,” says Delaney.

The biggest challenge during the broadcast will rest on the shoulders of the director since baseball broadcasts typically involve a lot of cutting from one camera to the next.

“The pace of the game and how it is cut is an issue, and it isn’t going to go away,” says Delaney. “It isn’t cut like football, hockey, or basketball, where the action goes side to side and you can stay wide with a shot. Baseball is about cut-cut-cut. With 3D, that can blow people’s heads off. So that is going to be the biggest learning curve.”

Delaney is already assembling a 3D wish list for future broadcasts, including the desire for a low camera position that is closer to the dugout and can give the viewer the sense of watching the game from the top of the dugout: “It would be like having the best seat in the house.”

By Ken Kerschbaumer, Sports Video Group

Telekom to Offer 3D via ADSL

German telco Deutsche Telekom will add 3D content to the digital video library of its IPTV platform Entertain. The new feature will be launched in early September during international consumer electronics fair IFA in Berlin, a Telekom spokesman said in Hamburg.

At first, only 3D movies will be offered, but games from domestic soccer league Bundesliga will be added later. Negotiations are already underway with national soccer association DFL, the spokesman said. No details were given on the subscription costs.

For 3D reception, customers need a 3D TV set with 3D glasses. Telekom’s IPTV box is already capable of handling 3D signals.

On 7 May, Telekom entered the 3D world for the first time: the telco transmitted the opening game of the ice hockey world championship in Gelsenkirchen live via Entertain, marking the first time a sports event from Germany was broadcast live in 3D on television.

By Jörn Kriege, Rapid TV News

Astro Decides on 3D World Cup Final

Malaysia’s Astro All Asia Networks will air its first 3D TV offering this weekend, with Sunday’s FIFA World Cup Final to be shown in 3D, in collaboration with Sony Malaysia. The 3D coverage of the final between the Netherlands and Spain follows a live 3D telecast of the Netherlands vs Uruguay semi-final for invited guests on Wednesday morning.

Subscribers with the new Astro B.yond set-top and a sports package subscription will be able to watch in 3D – if, of course, they are also equipped with a 3D-compatible TV. Astro claims to be the first broadcaster in south-east Asia to offer 3D content.

By Rose Major, Rapid TV News

How to Surf the 3D Movie Wave?

Investors looking to cash in on the 3D craze take note: rather than betting on a volatile box office, your best bet might be with companies that make watching and screening movies in three dimensions possible.

With 3D all the rage at the local cineplex -- thanks largely to James Cameron's Avatar -- RealD, the top supplier of U.S. 3D theater projection gear, is set to debut on July 16 with strong expectations for its initial public offering.

Analysts advise buying into infrastructure companies like RealD for now, rather than bet on box office fortunes or the theater chains sinking billions in upgrading to carry 3D.

"The 3D market is an embryonic growth market right now. If you get in early with a company that dominates the segment, you could do well," said Francis Gaskins, president of IPOdesktop.com.

Analysts spy hidden gems among smaller hardware and services vendors such as RealD. They favored companies that focus on hardware to aid the upgrade or film-conversion to 3D, or provide technology to screen it.

Media Valuation Partners principal Larry Gerbrandt likened the boom to high-definition TV, when the likes of Technicolor SA's Grass Valley -- which made broadcast switches needed for the transition -- became early beneficiaries.

"It's more of an infrastructure play. You can't point to any one company and say they're going to be 'the' 3D play," he said. "It's one of these things where, over time, the whole infrastructure gets upgraded to 3D."

The 3D film business has mushroomed since Avatar became the highest-grossing movie of all time. Hollywood is cramming its release schedule with high-profile films like Toy Story 3. And theaters are scrambling to upgrade screens for 3D -- an estimated $3 billion exercise in North America alone.

The question is where to invest.

Marla Backer, analyst with Hudson Square Research, likes Ballantyne Strong Inc -- which prepares theaters for digital projection. That's despite the Omaha, Nebraska, company's stock price more than tripling to $7.50 from $2.20 in the past year -- to a pricey 29 times estimated 2010 earnings.

Carmike Cinemas Inc, a small theater chain, has a greater proportion of 3D screens than larger competitors -- about 500 out of 2,200 -- and would thus be a disproportionate beneficiary of the 3D craze.

And she likes IMAX Corp, with its seven-storey screens, as a popular showcase for such movies.

Chinks in the Armor
Less certain is whether big-name theater chains and studios actually make money, longer-term, off this emerging technology. Exhibitors like Regal Entertainment have charged up to $5 on top of a standard ticket for 3D films. But some analysts suggest 3D is already showing signs of weakness.

BTIG Research analyst Richard Greenfield said box office revenue from 3D screenings is already declining. Walt Disney Co's Toy Story 3 made 60 percent of its opening weekend gross from 3D, compared with 70 percent for Alice in Wonderland in March.

"With the economy still recovering, we worry that movie exhibitors' view of consumer demand for 3D is disconnected from reality," he wrote.

Infrastructure may be a safer bet. Of some 130,000 screens globally, only 10,000 are 3D enabled. More than 5,000 theaters carry RealD 3D equipment and it said it has deals to convert another 5,000.

If RealD's IPO take-up is strong, that might encourage others. Some are already plugging rivals such as X6D Limited, which sells under the Xpand brand, and Burbank, California-based MasterImage 3D.

"RealD is the biggest by far in the U.S., but I'd say MasterImage and Xpand will give RealD a run on the global business and will make inroads in the market domestically," said Scott Hettrick, editor of 3DHollywood.net.

RealD competes with audio equipment maker Dolby Laboratories Inc, which also sells 3D projection systems for theaters. Eric Cohen, corporate development vice president for Dolby, told a tech conference in New York that the company's 3D systems are installed in about 3,300 theaters worldwide.

Burbank-based 3ality Digital is a maker of 3D cameras that industry sources also peg as an IPO candidate.

"Business is growing at an extraordinary rate and we expect to be profitable in 2010," 3ality CEO Sandy Climan said.

Conversion -- A Big Play
Outfits that convert regular film and television shows into 3D may be another choice investment, analysts said. One expert pegged the market at $35 billion in the next five years.

Filming live-action 3D is costly and untested, apart from Avatar. Conversion is more of a known quantity and can fill a home entertainment market for 3D content. IMS Research expects over 200 million 3D-equipped TV sets to be shipped by 2015, and Hollywood cannot turn out new movies fast enough.

Conversion costs range from $50,000 to $150,000 a minute per film, depending on the visual challenges involved.

Prime Focus, a company started in India that has grown to 1,200 employees and become one of the largest players in 3D conversion, was stung by bad reviews for its job on this year's Clash of the Titans. Its London share price fell 4.5 percent after the movie's release, but has since bounced back.

Other major players in the arena are In-Three Inc, Legend 3D and Sony Corp's Imageworks. All worked on Alice.

"The conversion companies will do a big business over this next period," said Michael Peyser, a movie producer who teaches in the University of Southern California school of cinema.

Rob Hummel, Prime Focus' post-production chief, said his phone is already ringing off the hook.

"I want to get us so we never have to say no," Hummel said. "We tell them, 'You better book us soon because if you don't book us, someone else will.'"

By Alex Dobuzinskis and Sue Zeidler, Reuters

All Mobile Video Looks to Harness the Future of 3D

As the demand for live sports and entertainment productions to be captured in 3D continues to increase, compatible facilities to support this are now becoming readily available. Major sports productions like the FIFA World Cup tournament as well as a number of entertainment events in theatres are driving this demand.

All Mobile Video (AMV), a veteran mobile production company based in New York City that has been at the forefront of a number of industry transitions (they built the first all-digital, standard definition truck ‘Celebrity’ in 1998), is meeting the challenge with a new 3D-centric truck that promises to not only help producers bring 3D to homes and theatres across the United States, but will also serve as a teaching tool.

Called Epic 3D, the new 54ft-long ‘test bed’ expands along 47ft of its curb-side and 33ft on the roadside to more than 15ft wide, and can support standard HD recording as well as the highest-quality 4:4:4 colour sampling at multiple frame rates for improved colour. In a standard high definition 4:2:2 recording, half of the colour information is discarded before recording. Using 4:4:4 colour sampling, more colour information is captured, and the resulting image is that much more brilliant, making for a captivating 3D experience.
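The 4:2:2 versus 4:4:4 point can be made concrete: in 4:2:2 the two colour-difference channels keep only every other horizontal sample, so half of the chroma data is gone before anything hits tape. The snippet below is a simplified illustration (8-bit values, naive pair-averaging), not the truck's actual signal path.

```python
import numpy as np

def subsample_422(ycbcr):
    """Reduce a 4:4:4 YCbCr image (H, W, 3) to 4:2:2-style storage.

    Luma (Y) keeps every sample; Cb and Cr keep one sample per horizontal
    pixel pair (averaged here), i.e. half the chroma information.
    """
    y = ycbcr[..., 0]
    cb = ((ycbcr[:, 0::2, 1].astype(np.uint16) + ycbcr[:, 1::2, 1]) // 2).astype(np.uint8)
    cr = ((ycbcr[:, 0::2, 2].astype(np.uint16) + ycbcr[:, 1::2, 2]) // 2).astype(np.uint8)
    return y, cb, cr

img = np.random.randint(0, 256, (1080, 1920, 3), dtype=np.uint8)
y, cb, cr = subsample_422(img)
full_chroma = 2 * y.size                 # Cb + Cr samples in 4:4:4
kept_chroma = cb.size + cr.size          # Cb + Cr samples in 4:2:2
print(kept_chroma / full_chroma)         # 0.5 -> half the colour information
```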

What makes Epic 3D unique, among the handful of similar trucks now becoming available from a number of production companies, is that it was built from the ground up to be 3D-capable, handling the entire production chain, from acquisition through to transmission.

“The idea with Epic 3D is that it is a complete turnkey solution for 3D acquisition,” says Eric Duke, president of AMV. “You won’t need any other vehicle to produce and distribute the finest 3D available today.”

When it hits the road this month, Epic 3D will carry a full complement of Sony HDCAM SR (4:4:4 processing) recording equipment, 18 Sony HDC 1500 HD cameras, six camera rigs made by 3Ality Digital (both side-by-side and beam splitter rigs, each holding two Sony HD cameras and including computer assisted convergence capability), a Sony MVS-8000X production switcher, and a Studer Vista 8 Digital audio console (250 input, 62 fader) for producing full 7.1 surround sound mixes.

On-board Signal Monitoring
The new truck includes a spacious Production area, where the front row accommodates the director, TD and producer, while a back row features six ‘convergence’ positions, one for each 3D camera rig in use. The front wall features 16 Sony LCD displays running off a number of Kaleido-X16 3D multi-viewers from Miranda Technologies.

These monitor wall displays combine stereoscopic 3D sources carried as dual 1.5 Gbps signals from each camera rig (the truck is outfitted with a redundant 3 Gbps infrastructure), and they can also show frame-compatible formats — left and right images side by side, or top and bottom — in a single HD window. Stereoscopic 3D sources can be combined on the same display with HD and SD sources, with full flexibility regarding layout configuration. Each area of the truck, namely Engineering, Production, Audio, Camera Shading and Tape, has its own Kaleido-X16 multi-viewer and layout. The multi-viewer outputs are connected directly to Sony stereoscopic 3D LCD displays, hung in both portrait and landscape orientation.

The Kaleido-X16 multi-viewers allow up to 128 inputs to be displayed across the front wall inside the truck. The control and layout characteristics in stereoscopic 3D are the same as with 2D video, and the Kaleido-X16 can display 3D and 2D signals simultaneously. In addition, HD-SDI outputs from each multiviewer can provide a copy of the multi-viewer display, which can be fed to the router for distribution throughout the truck, and outside for commentary or other requirements. By using high quality stereoscopic 3D monitoring during production, AMV’s convergence crew can fully assess the quality and compare perspectives before switching between two stereoscopic 3D cameras.

In addition to the stereoscopic 3D multiviewers, AMV has also installed interfacing equipment from Miranda, including the new 3DX-3901 3D signal processor, used to convert among the many types of stereoscopic 3D signals on board the truck. The processor also provides correction and electronic alignment of images coming from the 3Ality Digital 3D rigs. When the truck is used for 2D events, the 3DX-3901 can be used as a traditional 3Gbps/HD frame sync and up/down/cross converter, avoiding the cost of extra equipment in the truck.

“We really like what Miranda has developed in terms of 3D monitoring and signal processing,” Duke says, adding that the crew will wear polarised glasses during a production to view the converged 3D sources. “Our crews love the Kaleido-X16 multi-viewers because they are so flexible and can be set up differently for different directors, depending upon how they like to work.”

Making 3D Practical
In order to save clients money, AMV engineers are working on a solution that will enable the truck to produce projects simulcast in both 3D and 1080i or 1080p HD.

“We don’t know yet how it’s going to work for every show, but we will have a solution that will be cost-effective for producers.”

One idea is to have a second production switcher on board (on the back row) to handle the 2D show, using left or right eye only camera sources. However, this adds cost to the client, and 3D production is expensive when compared to a traditional HD project.

To be cost-effective for AMV, the truck had to support the multitude of signal types now requested in HD, plus the new stereoscopic formats. This includes: 1920x1080 HD and stereoscopic 3D (S3D) production, progressive or interlaced, at 23.98p, 24p, 29.97p, 50i and 60i and up to 60p, with standard 4:2:2 colour sampling; dual-link recording (left eye/right eye to one tape) with standard 4:2:2 colour space sampling to HDCAM SR tape; and the 720/60p HD format.

“Currently, the cost of doing a 3D production is rather high, even by early HD standards,” Duke says. “When we went from SD to HD, maybe we added one more person to the truck. With 3D, we’re adding one person per rig, plus a stereographer, plus a processing engineer. So you could be up to ten people on top of the standard crew that is necessary to produce a standard HD event. And if we do a 2D/3D simulcast, we need a second production switcher and TD. There’s a big difference between 2D and 3D in terms of acquisition and making it all fit into an overall production that today’s viewers have come to expect.”

AMV is also dealing with the challenges of the cumbersome 3D rigs and how they can be positioned to get the most benefit for viewers at home without taking up too many seats within a venue. A 3D hockey game produced and televised by Madison Square Garden Network, in New York City, in March reportedly ‘lost’ about 700 seats to 3D camera positions.

“For years we’ve been striving to make HD camera positions smaller and smaller; now we have these large 3D rigs, which must be placed closer to the field,” Duke says. “Now we have to figure out how to make it work for everyone, because if you take money away from the house, they won’t be so receptive to accommodating a 3D production. That’s the only way this will succeed.”

For now, AMV is excited to get the truck in action and see what happens. Apparently, so is the rest of the industry, which is looking at what AMV is doing to plan their next moves and learn from their experiences. Everyone agrees that 3D production is still in the development stage and events like the FIFA World Cup will go a long way to working out the kinks.

AMV is also outfitting one of its studios in New York for 3D because it feels it can manage the signal processing much more easily in a studio environment.

“We tend to go into new territory ahead of everyone. We try to set the standard and stay ahead of the curve. No one knows how 3D is going to be received in the marketplace but we see a great future for 3D projects.”

By Michael Grottecelli, TVB Europe

Orange Opens with 3D

With a content business growing in the double digits each year, and a stated aim to be more than just a utility to its customers, Orange is one of the few IPTV operators to be taking 3DTV seriously enough to be putting it into operation now — rather than adopting a ‘wait and see’ approach. However the launch of its new 3DTV channel is not without caveats.

“We don’t have enough content to launch a full 3DTV channel and that’s why we are calling it a dedicated 3D service,” explains Ghislaine Le Rhun Gautier, Orange 3D project director. “The objective is to show occasional events such as football, live performances like circus or ballet, documentaries and a 3D promo reel as the market builds.”

Movies will come later, once the rights to air new 3D feature film content become available on VoD following release on Blu-ray.

The telco’s 8.9 million broadband customers in France have the potential to view this 3D content for free, delivered via DSL and fibre, so long as they have a 3DTV and at least 8Mbps downstream, which pretty much excludes those without the higher speeds facilitated by fibre connections.

Raoul Roverato, executive vice-president of New Growth Businesses at Orange, indicated that, of the other markets where Orange is present, Poland could be the next target for a 3D rollout. This won’t happen this year, he said, but he suggested that the Euro 2012 football tournament hosted by Poland and Ukraine could be the perfect launch pad for such a channel. Importantly for the company, it has beaten rival Canal+ to market; Canal+ has signalled its intent to launch around Christmas 2010.

The channel’s debut coincided with coverage of the French Open tennis Grand Slam. With all the matches on the main Philippe Chatrier court aired, it laid claim to being the first European multi-day sporting tournament to be broadcast live in 3D.

3D Operation at Roland Garros
Intriguingly the workflow applied by Orange, in partnership with host broadcaster France Télévisions, through the UK’s Can Communicate, was virtually identical to that planned for the stereoscopic broadcast of the World Cup in South Africa. Perhaps that’s no surprise given that Can are technical consultants to both 3D projects. Orange said that they had worked with NHK and 3ality on previous trials.

At Roland Garros the court was covered with five 3D camera positions, four of them featuring HDC1500s and Canon lenses mounted on Element Technica Quasar rigs.

“We had three mirror rigs shooting through with a full body 1500 and under the mirror with a T-block configuration,” explains Can Communicate stereographer Richard Hindley who is one of two lead stereographers for HBS in South Africa. “Another Quasar was arrayed side by side overlooking the court where there’s more room.”

The signals were each fed through an MPE-200 3D Processor for lens alignment, with the images viewed on a monochrome monitor by one of four convergence pullers. In the dedicated 3D truck, Hindley and Can Communicate colleague John Perry monitored the output, directed the convergence pullers and advised the director. A pair of EVS machines provided 3D replays, post-correction by the stereographers.

“We were working within a 2.5% depth budget for shots pulled behind the screen plane and about 0.5% for shots in negative parallax,” explains Hindley. “We have to be a little conservative because we are conscious of consistency for both small and large screens.”
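In pixel and physical terms, that depth budget works out as follows; the living-room screen width used here is an assumed example, and the point is simply that the same percentage becomes a larger physical offset on a bigger screen, which is why the settings stay conservative.

```python
frame_width_px = 1920
positive_budget = 0.025    # 2.5% of frame width behind the screen plane
negative_budget = 0.005    # 0.5% of frame width in front (negative parallax)

print(positive_budget * frame_width_px)   # 48 px maximum positive parallax
print(negative_budget * frame_width_px)   # ~10 px maximum negative parallax

# On an assumed 1.1 m wide living-room screen, 2.5% is ~2.8 cm of on-screen
# separation, below a typical 6.5 cm interocular distance; on a cinema screen
# many metres wide the same percentage grows proportionally, pushing towards
# that limit, hence the conservative budget for mixed screen sizes.
screen_width_m = 1.1
print(f"{positive_budget * screen_width_m * 100:.1f} cm on a {screen_width_m} m wide screen")
```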

The men’s final was broadcast live in 3D to select cinemas by the Fédération Française de Tennis and France Télévisions, though Orange was not involved.

All that technology is the same as is being applied to the World Cup 3D coverage. The one significant difference was the use of Panasonic’s twin-lens AG-3DA1 integrated 3D camcorder. Orange Labs’ R&D division had been playing with prototypes for a month and originally intended to test on ENG material at Roland Garros but ended up placing it as one of the five main courtside positions.

“We worked with Panasonic to develop new functions for the camera,” explains Orange TV’s Jerome Fournier. “These include an in-camera notification to the camera-operator of the limits within which they could pull focus on foreground and background objects without creating eye discomfort.”

A tracking shot, using the lightweight camcorder suspended from an aerial cable 28m above the court, was also trialled.

“We were surprised by how good these were considering the movement on the cable,” says Roverato. “We were impressed by the stability and clarity of the images, which is why we used them for some of the live action.”

Nonetheless, aerial shots during the tournament were otherwise converted to 3D from the 2D pictures of a conventional HD aerialcam using JVC’s IF-2D3D1 box.

The on-court AG-3DA1 was positioned facing a VIP section of the crowd, capturing cut-away reaction, because it was not initially deemed suitable for coverage of the match in action. However during the second week of the tournament the crew did experiment with shots of the on-court action using the AG-3DA1 and some of these were transmitted.

“We had some problems matching the colorimetry on the Panasonic with that of the other four cameras, but other than that it’s worked really well,” Hindley says.

The broadcast was transmitted in a side-by-side configuration at 1080i. Japanese broadcaster Wowow and Al Jazeera also took the 3D feed from Orange.

“We are investing €2 billion between now and 2015 on deployment of 100Mbps FTTH and we associate 3D with that superior experience,” adds Le Rhun Gautier. “We are an innovating group and in 3D we have been pioneers.”

Orange owns rights to French Ligue 1 and delivered a live broadcast in 3D of the Ligue 1 match between OL and PSG last year, beating BSkyB’s live 3D football premiere by a year. However Roverato said that Ligue 1 would not be covered in 3D next season because the costs were prohibitive.

“3DTV is at least five years from becoming mainstream,” he says. “The new channel is more of a marketing initiative to show the platform’s potential and accustom consumers to the idea of watching content in 3D.”

He added that Orange is investing a few million euros each year in commissioning and delivering 3D content. It may also swap content with BSkyB.

“They have content, we need content and they may want some of ours, so it makes sense for us to co-operate to create a new market by exchanging content,” says Le Rhun Gautier. “It’s important for us to be in the market now in order to try and create the market by bringing live events and other highlights to people and showing them the difference 3D can make.”

By Adrian Pennington, TVB Europe

Broadcasters Prefer Service Compatible 3D Format

Over half of broadcasters would prefer the adoption of a Service Compatible format for 3DTV, marking a split with the side-by-side transmissions favoured by pay-TV operators, according to a survey conducted by the European Broadcasting Union (EBU) and seen by Broadband TV News. The survey asked EBU members, largely drawn from broadcasters with public service obligations, which of the systems currently under development best met their potential needs if and when the decision was taken to broadcast in 3DTV.

The Service Compatible system – which needs both a new display and a new set-top box, but can be viewed in 2D on a normal receiver and gives the highest 3D quality possible today – received interest from 51.9% of respondents. It works by adding additional information to the signal that is only decoded by a 3DTV, allowing regular viewers to watch their programme of choice without the need for 3D audiences to retune or for broadcasters to use substantially more bandwidth.

The Frame Compatible system – which needs a new display but not a new set-top box and provides somewhat higher quality, effectively the system being deployed in Europe and around the world by the pay-TV sector – achieved 11.1%. This was the same score as Colour Anaglyph, the 1950s throwback that has been used more recently by Channel 4 and Virgin Media in their 3D seasons.

25.9% of respondents said none of the systems met their needs.

Service Compatible is most likely to be used in the DTT horizontal market for occasional broadcasts, offering a higher image quality than that achieved by pay-TV with the half-resolution-per-eye side-by-side frame compatible method. The transmission company Arqiva recently confirmed plans to run live 3D trials on the UK terrestrial system.

The EBU is one of a number of organizations working on the development of 3D standards. The DVB is developing the broadcast signal formats for 3DTV, SMPTE is developing a ‘file format’ for 3DTV production, while IEC/ISO are concentrating on compression systems for 3DTV delivery. Other work includes the 3D@Home consortium that is looking into ways that one pair of glasses purchased for one manufacturer’s product could work with another. The European Broadcasting Union (EBU) itself is looking at broadcast requirements. All these will ultimately fall under the ITU-R that will make recommendations for the use of the format.

By Julian Clover, Broadband TV News

Vet Filmmaker's Aim is 3D for the Masses

Randal Kleiser, best known for such iconic 1970s hits as Grease and The Blue Lagoon, has become a passionate advocate of stereo 3D, which he hopes to spread to mobile devices. But Kleiser is no recent convert: he was actually part of the team of directors, including James Cameron and George Lucas, who came to ShoWest to promote digital 3D in 2005, when their passion still seemed somewhat quixotic.

A principal in 3D technology startup CubicVue, Kleiser said he's been fascinated by 3D ever since he saw the Ping-Pong sequence in 1953's House of Wax when he was a kid. But the first opportunity to work in 3D came to Kleiser, who directed the 1992 sequel Honey, I Blew Up the Kid, when Disney asked him to direct a 70mm 3D attraction Honey, I Shrunk the Audience that ran in theme parks for more than a decade.

Lensed by Dean Cundey (Who Framed Roger Rabbit) with two interlocked 70mm cameras, the film was shown in a theater with 600 seats "mounted on a computer-controlled platform that could move in sync to the screen, and various devices built into each seat that would stimulate the viewers' senses," Kleiser said.

"The concept was that the audience was experiencing a live stage event," he added. "To create this effect, we calibrated the distance of filming and projection so that an actor's onscreen image would be life-size on the 54-foot-wide screen. To give the impression that everything was live, we could not have the intrusion of film grain. Because a beam splitter was used for the second camera, an entire stop was lost. The movie was shot with a huge amount of light on 5248 (color negative) film to give it a sharp quality. In preparing to shoot the film, we screened every 3D movie we could find to analyze what worked and what didn't," he added. "It soon became clear that this was another world with its own techniques, terminology, rules, problems and choices."

So Kleiser came to study the grammar of 3D, becoming well-versed in topics such as convergence points and the causes of eye strain.

"Cutting patterns in 3D greatly affect the viewer's comfort," he said. "Every time there is a cut, the eyes have to readjust to the new planes of depth. This is a physical effort by the viewer's eye muscles each time the scene changes. A fast cut sequence would send most people screaming for the exits. In Honey, since we were trying to create the feeling of a live event, there are no apparent cuts. The eyes never have to adjust to a new plane; the 3D effects happen within the proscenium.

"One of the most distracting phenomena in the 3D format is when an image 'breaks the frame,' that is, when it appears to be closer than the plane of the screen but cut off by it," he continued. "For truly controlled and effective 3D, the screen needs to be thought of as a window. Any image can be inside the window, or come through the window, but can't break through the window frame."

Kleiser's interest in 3D took the director to various technology conferences.

"At one of these I met inventor Michael Mehrle, who showed me an iPod he had converted so it showed 3D without glasses. I was blown away. It was basically a piece of plastic with thin lines of color that sent one image to each eye. It was like the screen was wearing the glasses instead of the viewer. His process uses magenta and green as the two colors that, when viewed, are merged into full color by the brain." (The system could be developed to support other color systems.)

With this development, Kleiser, Mehrle and several additional partners formed CubicVue to bring this technology to consumers with an appetite for 3D. The CubicVue technology essentially displays glasses-free 3D on a cell phone, tablet, game console, media player or other portable flat-screen device.
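
For illustration only, the colour-multiplexing idea Mehrle describes can be sketched in a few lines of Python. This is a generic green/magenta mix of a stereo pair, not CubicVue's patented filter design; the function name and the channel assignment are assumptions made for the example.

    import numpy as np

    def green_magenta_multiplex(left: np.ndarray, right: np.ndarray) -> np.ndarray:
        """Generic green/magenta colour multiplex of a stereo pair.

        left, right: HxWx3 uint8 RGB frames of the same size. The magenta
        (red + blue) channels come from the left eye and the green channel
        from the right eye; a matching filter over the screen routes each
        colour group to the intended eye, and the brain fuses the two views
        back into a full-colour 3D image.
        """
        assert left.shape == right.shape
        out = np.empty_like(left)
        out[..., 0] = left[..., 0]   # red   -> left eye
        out[..., 1] = right[..., 1]  # green -> right eye
        out[..., 2] = left[..., 2]   # blue  -> left eye
        return out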

"Now we are searching for the perfect partner to take this technology to the next level," Kleiser said, adding that CubicVue is talking with potential partners for licensing, manufacturing and distribution of the technology. The aim is to have product on the market before the end of the year.

The technology uses a patent-pending color filter that could either be embedded into handheld devices or layered over the screens of existing devices. Mehrle said the technology is capable of supporting full-resolution imagery to each eye (1280x1024) and theoretically could be applied to any size screen, though small screens are the most practical and thus the company's focus. CubicVue is aiming to create the standalone filters for a retail price of less than $50.

Mehrle said the system could be used to view any stereo content that was encoded with a free open source code, which the company will make available.

"The 'standard' that we support is already available on YouTube," Mehrle said, explaining that there is therefore a wide range of 3D content on the Internet that is already encoded for use with the CubicVue system.

"Available 3D content is growing by the day. But I think gaming will be the killer app in the future, because it is interactive," Mehrle said. "Games can be very quickly supported. And if a game studio decides to adopt our system, all of their content will be automatically in stereoscopic 3D."

While serving as chief industry liaison for CubicVue, Kleiser also is in discussions about directing some new 3D productions and monitoring 2D to 3D conversion opportunities for some of his films, such as his classic Grease (whose sing-along version opens July 8 in theaters).

"It needs to get a little better before I do that," the helmer said of the conversion process. "I don't think conversion techniques are there yet. It can be effective, and it can be not so effective with the current technology. Sometimes I think it is better not to do it at this point. Sometimes it is distracting because the current technology is not up to speed to really make it flawlessly invisible. I imagine that will change in a year or two."

Kleiser also shared his thoughts about uses of 3D. "I'm not that interested in seeing My Dinner With Andre in 3D, although some people say they would. There are certain movies that don't need it."

Kleiser said that once the CubicVue technology is released, he would also like to create specific content for the portable platform.

"There aren't more autostereoscopic screens on the market right now because they are not very friendly for the user," said Rob Auten, CubicVue's chief development officer. "They are very particular. They are either very expensive or require complicated integration from the manufacturing level. We are -- with technology that is around right now -- trying to create something that is easy for the consumer to apply or for the manufacturer to integrate and that doesn't require a new panel.

"Our base product is very simple," he said. "We are hoping simplicity and a higher-image fidelity will provide an exciting product."

By Carolyn Giardina, The Hollywood Reporter

Eutelsat Satellites Beam 2010 FIFA World Cup in 3D to Cinemas Across Europe

Working with key players in 3D development, including Sony, Eutelsat has optimised the FIFA World Cup as a commercial platform for 3D viewing in out-of-home venues. Seventeen of the 60 matches played so far in South Africa have been transmitted in 3D across Europe through Eutelsat satellites and shown on 50-foot cinema screens in 19 countries, including France, Germany, Italy, Spain, Russia, Poland, the Nordic countries and the Baltics. The four remaining matches, from the semi-finals through to the final, will also be broadcast in 3D, with cinemas in additional countries, such as the Netherlands, taking the signal, underscoring the scalability of a satellite-based network.

Eutelsat calculates that over 250 hours of 3D transmissions will have been transported by its satellites by the time the FIFA World Cup ends on July 11. Signals are broadcast in Europe through its ATLANTIC BIRD 3 and W7 satellites, using 40 Mbps of throughput to ensure both the quality and the robustness of each transmission. Eutelsat is running five feeds: four configured for cinemas, with English, Italian, French and Russian commentary, and one TV signal operated by the French broadcaster TF1, which is available in France on the FRANSAT digital platform.

The FIFA World Cup production in 3D is managed by the event's appointed host broadcaster, HBS, using Sony technology. The content is delivered by GlobeCast via W2A to Eutelsat's teleport near Paris, where it is retransmitted to ATLANTIC BIRD 3 for Western and Central Europe. A second teleport in Moscow ensures distribution via W7 in Russia.

Over 200 of an expanding network of more than 400 digital cinemas are enabled to receive the live 3D signals using equipment provided, installed and managed in real time by Eutelsat in collaboration with OpenSky. The equipment comprises a 1.5 metre receive antenna and a professional IDC receiver with Sensio decoding and BISS decryption.

Source: Eutelsat

3D Production: FIFA World Cup 2010

In December 2009, FIFA and Sony announced plans for 3D coverage of 25 FIFA World Cup matches. Integral to the selection and adoption of the technologies were Peter Angell, HBS director of production & programming, who served as FIFA special 3D project leader, and Duncan Humphreys, 3D consultant to HBS for the World Cup and partner in UK-based 3D production company Can Communicate.

The aim was to deliver TV-viewing soccer fans a new experience of their sport and give them even more to cheer about at the World Cup. The tournament should help to kick off what TV makers, networks – and advertisers – hope will become a new dimension in home sports viewing and sports viewing in cinemas. Just as high-definition TV improved sports viewing by adding a sharper, wider field of vision, 3D adds depth to the field, increasing the illusion that you are watching the event in person but closer to the action.

For the 3D production of the 2010 World Cup, Sony has developed a 3D platform that combines processor, switcher, lenses and camera rigs. The company’s system integration facility in Basingstoke fitted a 3D layer onto the T16 HD truck from UK outside broadcast supplier Telegenic as well as onto the Car8 HD truck from the French production company AMP. Both units were air-freighted directly to South Africa aboard Antonov aircraft. The AMP truck handles productions in two stadiums, Ellis Park and Soccer City in Johannesburg, while the Telegenic unit covers matches in Durban, Cape Town and Port Elizabeth.


AMP Car8 being air-freighted with an Antonov to South Africa


The 3D Layer OBVans from AMP and Telegenic
The Sony MVS-8000 production switchers in the AMP Car8 and the Telegenic T16 were upgraded with a 3D software package, and 24- and 42-inch LMD-series 3D displays were installed in the monitor wall. PVM 23-inch monitors were used to view camera setup and stereo channel balance. A convergence area was implemented where eight MPE-200 multi-image processors with MPES-3D01 stereo image processing software help the convergence engineers maintain camera alignment and control the rigs. Each box takes in two video streams along with lens metadata from the left and right camera outputs and provides electronic picture correction. It can correct horizontal and vertical image shift, toe-in, tilt and rotation, zoom synchronization, color misalignment and any inversions caused by the use of mirror rigs.
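
As a rough illustration of the kind of geometric correction described (not Sony's actual MPE-200 processing), the sketch below applies a shift, rotation and zoom correction to one eye of a stereo pair using OpenCV; the function and parameter names are invented for the example.

    import cv2

    def correct_eye(img, dx=0.0, dy=0.0, rotation_deg=0.0, zoom=1.0):
        """Apply a simple corrective transform to one eye of a stereo pair.

        dx, dy       : horizontal/vertical image shift in pixels
        rotation_deg : rotation error to compensate, in degrees
        zoom         : relative zoom factor to match the other eye
        """
        h, w = img.shape[:2]
        # Rotation and zoom about the image centre, then translation.
        m = cv2.getRotationMatrix2D((w / 2, h / 2), rotation_deg, zoom)
        m[0, 2] += dx
        m[1, 2] += dy
        return cv2.warpAffine(img, m, (w, h), flags=cv2.INTER_LINEAR)

    # A mirror-rig eye is typically flipped before correction, for example:
    # right = cv2.flip(right, 0)   # undo the vertical inversion from the mirror
    # right = correct_eye(right, dx=-3.5, dy=1.0, rotation_deg=0.4, zoom=1.01)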


3D Camera Layer with MPE-200


The MPE-200 is designed to give outside broadcasters the ability to produce quality 3D without necessarily incurring the expense and additional time required by fully motorized rigs. The box calibrates the optical centers of the two lenses throughout the entire zoom range. After alignment, a convergence operator can set the required interaxial distance of the rig, and the software will calculate and correct for any misalignment during production. The operator can also monitor and adjust the signals to ensure they do not go beyond the depth budget boundaries.


MPES-3D01 stereo image processing software


At the FIFA World Cup the HDC-1500 cameras are working with Canon HJ22ex7.6B lenses, but Fujinon lenses are likely to be supported as well. The same is true for the rigs: at the World Cup the MPE-200 is controlling 3D rigs from Element Technica, but there is no reason why it could not control 3ality, P+S or Swiss rigs at some point. The product combines hardware based on the Sony Cell processor found in the PS3 with 3D software developed at the Sony R&D center in Basingstoke.

Later versions of the software are also planned to deliver enhanced graphics manipulation and digital effects. The processor includes a histogram displaying how much convergence is being pulled and also provides a variety of 3D monitoring methods, including 50 percent mix, above/below, anaglyph, difference and side-by-side. A convergence puller is assigned to each camera pair, responsible for alignment, camera and rig set-up, and pulling convergence live.
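
A minimal sketch of how such monitoring views can be derived from a left/right pair is shown below (illustrative Python/OpenCV only, not the MPE-200's implementation; the function and mode names are assumptions for the example).

    import cv2
    import numpy as np

    def monitor_view(left, right, mode="mix"):
        """Build a convergence-check view from a stereo pair of uint8 RGB frames.

        mode: "mix"      -> 50 percent blend of both eyes
              "diff"     -> absolute difference; misalignment shows up as edges
              "anaglyph" -> red channel from the left eye, green/blue from the right
              "sbs"      -> side-by-side, each eye squeezed to half width
        """
        if mode == "mix":
            return cv2.addWeighted(left, 0.5, right, 0.5, 0)
        if mode == "diff":
            return cv2.absdiff(left, right)
        if mode == "anaglyph":
            out = right.copy()
            out[..., 0] = left[..., 0]       # assumes RGB channel order
            return out
        if mode == "sbs":
            h, w = left.shape[:2]
            half = (w // 2, h)               # cv2.resize takes (width, height)
            return np.hstack([cv2.resize(left, half), cv2.resize(right, half)])
        raise ValueError(mode)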


Telegenic T16 3D Layer
The vast majority of coverage is captured in native 3D by mixing the signals from the eight HDC-1500 pairs, but certain shots are up-converted in order to deliver the best possible presentation of the action. Peter Angell explains: “Our primary goal is to tell the story of the match as well as possible, but that doesn’t mean littering the coverage with 2D shots. If there’s a particular incident which has only been captured on a 2D camera, or a 2D camera has the best angle, then, editorially, that shot is critical to the story, and we would be penalizing the viewer if that weren’t included.”


Telegenic T16


After weeks of tests in the run-up to the World Cup, the team finally decided to go with JVC’s IF-2D3D1 2D-to-3D conversion box, although some specific shots are adapted for 3D using the Sony MVS-8000 vision mixer. The video effects function of the MVS-8000 switcher can be used to split the image and create what Angell terms a “pseudo-3D” image from 2D cameras.


Convergence operator


AMP Car8 3D Layer
For the inclusion of live 3D graphics, HBS is working with FIFA’s graphics supplier deltatre to position graphics on the Z-axis depending on the shot selection. In South Africa both outside broadcast vehicles house XT[2]+ servers, which enable dual feeds to be recorded and played back instantly in full timecode synchronization.

The combination of hardware and software (XT[2]+ and MulticamLSM) makes all existing capabilities of MulticamLSM available for live 3D productions, including instant replay, loop recording, live clipping, playlist management, live slow motion, cueing and highlight editing.

Each of the two 3D OBVans houses six XT[2]+ servers under the control of an LSM remote device for the production of the matches. All 25 matches are recorded on HDCAM SR (SRW-5800) with full-bandwidth left- and right-eye signals on a single tape.


AMP Car8



Production area



Convergence area


The Positioning of the 3D Cameras in the Stadiums
For coverage of all 25 3D games in South Africa, Quasar rigs from Element Technica are used with Sony’s HDC-1500 cameras and Canon HJ22ex7.6B zoom lenses. There are 16 Quasar rigs in total, eight with each of the two 3D OBVans from AMP and Telegenic.

Each game is covered with eight camera pairs: four positioned on main camera shooting platforms slightly lower than the 2D positions, and four at field level. In each stadium there are four positions for Quasar side-by-side configurations: main camera wide, main camera tight, and goal line left/right. The side-by-side camera pairs are far enough away from the action that they rarely need to converge.

Nothing comes very close to these positions and there are no deep-background elements, unlike the positions on the pitch, where the action might be 5-20 meters away with deep background elements at 50-100 meters. Therefore four under/thru configurations are positioned at bench left/right and behind the goal left/right. These are mirror rigs because the action gets close to them and the cameras have to sit closer together than the side-by-side rigs allow. The under/thru rigs also allow the camera operators to have a full viewfinder and the lens controls at the back as normal.
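
The need for mirror rigs close to the action follows from basic stereo geometry: for roughly parallel cameras, image disparity scales with interaxial divided by subject distance, so an interaxial that is comfortable from a distant platform produces far too much disparity a few metres from the pitch. A tiny worked example is below; all numbers are assumptions, using the parallel-camera approximation disparity ≈ focal length × interaxial / distance.

    # Parallel-camera approximation: disparity on the sensor is roughly
    # focal_length * interaxial / subject_distance, in the same units as
    # focal_length. The numbers below are assumed for illustration only.
    def sensor_disparity_mm(focal_length_mm, interaxial_m, distance_m):
        return focal_length_mm * interaxial_m / distance_m

    f_mm = 10.0  # assumed focal length
    print(sensor_disparity_mm(f_mm, 0.06, 80.0))  # distant platform: 0.0075 mm
    print(sensor_disparity_mm(f_mm, 0.06, 8.0))   # pitch-side: 0.075 mm, ten times larger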


3D Production Camera Plan


Any camera operator can go to the back of the camera and know how it works. With the level of integration between the Quasar rigs, the Canon lenses, the Sony cameras, the MPE-200 processor box and the CCU, there is one fibre running from the truck to the rig, so all image data, communications, rig control, lens control, metadata — everything including power — now goes through a single SMPTE fibre. For instance, interocular and convergence can now be set with the Sony box or locally at the rigs. And full metadata information is available to the convergence pullers and stereographers in the 3D OBVan for analysis of the two images.

The camera positions in each of the five 3D venues (Durban, Cape Town, Port Elizabeth, and Ellis Park and Soccer City in Johannesburg) are virtually identical, and the similar positions make it easier for the production crews to work in different venues without having to substantially adjust their production philosophies.


AMP 3D Cameras at Ellis Park
The ability to quickly break down and set up the 3D production gear has been almost as important as the new skills related to the production. The team quickly realized that color coding all of the equipment for a given rig made it much easier to ship and assemble. About an hour of optical adjustment is all that is required to get the cameras ready for the match.

Also helping with the quality of the production is the decision to match a camera operator with a convergence puller. HBS even made the teamed-up camera and convergence operators swap roles to understand how the other half lives, allowing them to learn how things they do in their regular position can lead to problems for their partner.



The convergence puller must intuitively follow objects, like the football, out of the screen and make a decision to pull convergence in a split second. It’s a bit like a camera operator reacting to an event but the key thing is to understand the impact of what the convergence will achieve. While some people argue that convergence pullers should be replaced by automated convergence technology for financial reasons, the lesson learnt in South Africa so far is that for convergence pulling you need a human brain.


Telegenic 3D Cameras at Cape Town


The 3D Creative Aspect
While the first World Cup matches focused on simply using 3D to create uniform game coverage, the crews have now graduated to making the productions more sophisticated, with better cutting and timing. With each match, the productions improve. Many of the veteran 2D crew members are relearning their craft at the World Cup.

For example, the crews have learned that shooting in 3D allows the production to "cross the line" more often by allowing the cutting to cameras on both sides of the field. With 2D video, if the action is moving from left to right all the camera cuts need to be from the same side of the field to prevent the viewer from getting disoriented. Not so with 3D. With 3D, it is easier for viewers to orient themselves, and there is an increased perception of where the camera is on the field.

Another lesson learnt is the need to frame the main stadium 3D cameras a bit tighter on game action. However, the camera operators face a trade-off: if they are too tight on the shot it requires more panning, and quick movement can introduce motion blur and compression artifacts into the picture. It is a delicate balancing act for the camera operators, yet the coverage of the opening game between South Africa and Mexico already produced stunning results.

During breakaway and wide-angle shots, the 3D effect was more subtle, as if you were watching from the stands. But in replay and other close-ups of individual players, the 3D effects almost put you into the action. The post-goal close-ups of celebrations of fans in the stands and players on the field brought home the energy of the event. Ironically, while the cameras in the stand are going a little tighter, the cameras on the field are going a little wider. Wider shots from the field level introduce more elements that can add depth to a scene.

The quality of game coverage remained high throughout the broadcast, with few if any transmission glitches or convergence problems. The highlights were the replays, with the replay of the South African goal when the ball seemed destined to fly out of the screen and hit the viewer, topping the list. Most of the game coverage shot from the traditional soccer upper-level camera positions also provided enough depth, giving viewers a sense of the distance between players and the speed of the action.


The Coverage of the Games in 3D: The Director’s View
When Bruno Hullin, one of the two directors working on the 3D World Cup productions, sat down for his first 3D production on June 11 for the game between Mexico and South Africa, he knew the challenge ahead of him: how to take his skills from years of directing in 2D and apply them to 3D. And it’s a challenge that dozens of directors around the globe will face in the near future.

“The secret in 3D is that you start wide and zoom while, with the lower-pitch cameras, you need to be very wide and have players in the front of the image,” he says.

Telling the story of the match is paramount, and Hullin says, when relying primarily on cameras that are close to the action, the production team needs to learn new ways to follow the action.

“In 2D, I always cut by looking at the men on the pitch because that tells me what is happening,” says Hullin. “But, when I am on a 3D camera on the pitch, I have to look at the eyes of the players to understand what is happening and where to cut.”

One of the top tools in the 2D productions at the 2010 World Cup has been ultra-motion camera systems. Shooting at upwards of 300 frames per second has given viewers tightly framed views of emotional facial reactions. For the moment, however, those shots don’t work for 3D.

“With 3D,” Hullin notes, “we need layers of objects and, with low, tight shots from ultra-motion systems, we only have one layer, which is the player or the coach on the background, so there is no depth.” That said, he does believe there is a role for those shots within a 3D production because the images are so impressive.

One of the issues still to be sorted out for all 3D broadcasts is the number of cameras. Does 3D need the same number as a 2D broadcast, or can it get by with fewer? Hullin says that eight cameras are enough for the 2010 World Cup, but more cameras located at a lower level would help tell the story even better, since pitch cameras are the primary tool for telling the story of the match in 3D.

“You can direct it like a 2D game, but it will not be interesting,” he says. “To be interesting, you need to find the spirit of 3D, because there are things that are possible in 3D that cannot be done in 2D. For example, you can stay with a wide shot in 3D while, in 2D, you would force the cut and force the view. But, in 3D, you allow the viewer to choose what they want to look at.”

On the creative side, 3D enables camera operators to finally leave the 4:3 “safe area” and use the full 16:9 screen area, as there is no letterboxing in 3D. In 3D, graphics can be pushed to the true edge of the picture and the full screen can be utilized for game action.


The 3D Master Control and Playout at the IBC
All the 3D productions in the stadiums were supervised by Peter Angell and Duncan Humphreys in a small 3D control environment at the IBC in Johannesburg, directly next to the 3D playout rack.

“Ninety-nine percent of people will see the World Cup in 2D HD so we can’t do anything to risk that coverage,” stresses Angell. “As long as 3D is a premium event proposition, it will face issues in the short term at any stadium which is already full of cameras and paid seating. If we were starting from scratch it would be straightforward but the 3D element adds another eight positions to the 32 per World Cup venue already dedicated to the 2D host coverage so it is difficult to get space.”


One rule to have emerged about 3D outside broadcasts is that coverage requires fewer cameras than 2D, with cut-aways and replays not as necessary to tell the story. Nonetheless, no specialty cameras have been included in the 3D mix for the World Cup, with HBS looking to convert occasional 2D shots from the armory of its other cameras to augment the 3D coverage.

“We have to be judicious about it,” Angell insists. “The goal is to tell the story as well as possible but that doesn’t mean littering the coverage with 2D shots. Ideally we need a means of cross conversion that retains enough of the 3D image so that it makes sense in the story we tell.”

Angell and Humphreys concluded that a conservative approach to 3D would be the best option, which meant devising a depth budget that wouldn’t jar the audience’s perception.

“We needed to decide exactly how to manage the depth budget and also how we decide to break it,” explains Humphreys. “Having it fixed at the beginning of production is fine but it’s important that you know how you can break that budget for effect and when it makes sense to do so.”

Footballs randomly booted into the stands and toward a 3D camera would make an obviously stunning 3D shot, but decisions need to be taken about what the outer limits of the convergence should be. For the World Cup games the convergence was settled broadly on a depth budget of 2-2.5% positive parallax (into the screen) and 0.5-1% negative (out of the screen).
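
Expressed in pixels, that budget is easy to work out; the quick calculation below assumes a 1920-pixel-wide frame.

    # Convert the stated depth budget from percent of screen width to pixel
    # disparity, assuming a 1920-pixel-wide frame.
    FRAME_WIDTH = 1920

    def parallax_pixels(percent_of_width, width=FRAME_WIDTH):
        return width * percent_of_width / 100.0

    print(parallax_pixels(2.5))  # 48.0 px maximum positive parallax (into the screen)
    print(parallax_pixels(1.0))  # 19.2 px maximum negative parallax (out of the screen)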


The FIFA World Cup 3D coverage will be delivered to cinema screens and to 3D TV sets. The position and size of graphical elements needed consideration for both types of viewing environment.

“Graphics can contribute to the overall 3D impact but if you play with them too much they become enormously distracting,” says Humphreys. “We initially put the clock and score as far into the screen corner as possible only to find it was more a problem in the cinema than it was for TV.”

“We are working out a set of values for where the graphics are best positioned on screen to give maximum effect without being completely overpowering,” says Angell. “The graphics will generally sit just in front of the screen plane, but if a player runs towards a camera we have the possibility of shifting it so we don’t end up with a situation where the graphic appears in front of the action when in 3D terms it should be behind. It’s a subtle trick to pull off.”

Recording is to HDCAM SR dual-stream VTRs (SRW-5800) on site and at the IBC in Johannesburg, as well as to an EVS XT[2] server controlled by IPDirector and an EVS XF[2] (removable storage) unit at the IBC for long-term archive.



Discrete left- and right-eye channels of 1080i50 HDSDI are sent from the 3D OBVans to the IBC over a JPEG2000 contribution network compressed to 300Mbps. From the IBC two redundant 3D signals will be sent via satellite to European theaters and homes via London, the site of FIFA’s distribution partner GlobeCast, using eight International Datacasting encoders (two at each of four venues) with integrated Sensio Technologies 3D processing.
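
For a sense of scale, the back-of-the-envelope calculation below compares the nominal rate of the two HD-SDI eye feeds with the 300Mbps JPEG2000 contribution pipe (counting the full 1.485Gbps link rate rather than active video only).

    # Rough ratio of the dual-link source to the 300 Mbit/s JPEG 2000 contribution feed.
    HDSDI_LINK_MBPS = 1485   # nominal 1.485 Gbit/s per HD-SDI link
    EYES = 2                 # discrete left- and right-eye channels
    JPEG2000_MBPS = 300

    ratio = (HDSDI_LINK_MBPS * EYES) / JPEG2000_MBPS
    print(f"approx. {ratio:.1f}:1 compression")   # approx. 9.9:1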

From London the 3D content will be distributed to signal providers such as Eutelsat for transmission to cinemas across Europe.


The 3D Cinema Experience
It is not just at the IBC FIFA HD Cinema that the FIFA World Cup is being delivered in stunning 3D. Ten broadcast networks and 400 theaters are distributing the World Cup 3D feed, including ESPN, Al Jazeera, SBS Korea, SBS Australia, SogeCable in Spain, TF1 and Canal+ in France and SKY Perfect JSAT in Japan.

The 3D feed is being screened around the world as cinemas are turned into impromptu ‘stadiums’ for 3D broadcasts. Cinema chains in Brazil, Mexico, the United States, Italy, Belgium, the UK, France, Spain, Korea and Japan have signed up to receive the live 3D broadcast of the 2010 World Cup. The remaining four matches, from the semi-finals onwards, will be broadcast to large screens including Gaumont and Europalace in France; Kinepolis in Belgium; Movieplex and Cine Cite in Italy; and Digital Cinema Media in the UK, which has signed a deal with SuperVision Media to show both semifinals and the final on 40 screens across the Odeon, Cineworld, Vue and Empire chains. In the United States, Sensio is working with digital cinema delivery group Cinedigm.

By Reinhard Penzel, Live Production

China Market: Shinco, TCL Unveil 3D CBHD Players

China-based Shinco Electric and TCL, currently the only two vendors of China Blue High-definition Disc (CBHD), have debuted 3D CBHD players to compete with 3D Blu-ray Disc (BD) players in the China market.

3D CBHD players, with retail prices yet to be set, can support 3D TVs offered by several vendors such as Samsung, according to the China High-definition Disc Industry Promotion Association.

While Sony, Samsung and other vendors are offering their 3D BD players in bundles with their 3D TVs, the CBHD Special Interest Group is talking with China-based vendors of 3D TVs for bundled sales.

By Erica Yen and Adam Hwang, DigiTimes

DIRECTV Launches Suite of 3D Services

DIRECTV this morning officially flipped the switch on its new 3D channel, n3D, as well as 3D VOD and PPV services. Located on channels 103, 104, and 105, they signal a new opportunity for 3D content creators to reach viewers across the U.S. and a new opportunity for DIRECTV to tighten its bond with tech-savvy subscribers.

“This is just the beginning and we will have additional deals with partners and our own 3D productions,” says Steven Roberts, senior vice president, DIRECTV. “We will have more quality content as it becomes available to provide the best video experience for our customers. This is the next step in the TV revolution.”

The audience at the present time is nascent, at best, but Shiro Kitajima, president of Panasonic Consumer Electronics Company, the company sponsoring the DIRECTV 3D network, says that 3D has been embraced much more quickly than HD, thanks to movies.

“Consumers appreciate the value and, unlike the early HD sets, 3D is only marginally more expensive than high-end flat-screen TVs,” he says. “We’re finding that this will be a great success as we work together to create 3D sports, concerts, and entertainment.”

NASCAR in 3D and the MLB All-Star Game in 3D are two of the upcoming sports events that will be featured on the network. Other programming includes nature programs, concerts sponsored by Guitar Center (not full-blown concerts but intimate in-studio performances), and more.

“We’re busy licensing content and on the production side the curve [of content available] is growing while the learning curve is getting shorter,” adds Roberts. “That means more and more content is going to be produced.”

Right now DIRECTV is not producing 2D programs converted to 3D, but if the quality of the converted material improves it could have a presence. “For now the focus is on native 3D.”

The 3D content is available over existing DIRECTV set-top boxes, with content delivered via side-by-side transmission at 1080p. The launch also culminates nearly four years of research and development.
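
With frame compatible side-by-side delivery, the packed picture travels through the set-top box like a normal HD frame and the 3D display separates and stretches the two halves. A minimal sketch of that display-side step is below (illustrative Python/OpenCV, not DIRECTV's or any TV maker's code; the function name is an assumption).

    import cv2

    def unpack_side_by_side(frame):
        """Split a frame-compatible side-by-side picture back into two eyes.

        Each half carries one eye at half horizontal resolution; the display
        stretches both halves back to full width before presenting one view
        to each eye.
        """
        h, w = frame.shape[:2]
        left_half, right_half = frame[:, : w // 2], frame[:, w // 2 :]
        full = (w, h)                        # (width, height) for cv2.resize
        return cv2.resize(left_half, full), cv2.resize(right_half, full)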

“We have been discussing this for four years in planning meetings and it was about this time last year that we thought the perfect storm was coming,” adds Roberts. “Not only was our technology and infrastructure for 3D ready but there were great consumer monitors from companies like Panasonic that were creating an experience at home that was as good as being at the theater. And now we can scale 3D to millions of homes.”

How long it will take to get 3D into millions of homes remains to be seen but Roberts says that DIRECTV is in it for the long haul as it looks to educate consumers about a 3D experience that is more than just making viewers feel like spears are being thrown out of the TV set.

“The productions are only going to get better, the technology is only going to get better, and the monitors are only going to get better,” he adds. “It is now a part of the home entertainment experience and providing 3D is part of our commitment.”

By Ken Kerschbaumer, Sports Video Group