NFL Football Experiments with 3D Broadcasts
Last night was the first live telecast of a 3D NFL football game. It was transmitted via satellite to theaters in Los Angeles, New York and Boston. I had an opportunity to see the game (at least until halftime) in New York. Overall, it was an impressive performance for a first-time event. 3Ality and Crosscreek Productions provided the camera work and video processing; Technicolor sent the signal to the theaters via its satellite system and RealD provided the theatrical projection system in the Chelsea Clearview Theater in New York. The San Diego Chargers demolished the Oakland Raiders 34 to 7. Here's my scorecard for the evening.
The 3D camera work was done with operators who were briefly trained in 3D. Shooting in 3D is different from shooting in 2D: pans should be slower, lensing may be different, and scenes need to be composed to avoid 3D objects extending off the border of the screen (frame violations). Following the fast action of a football game is a challenge in 3D. The crew had nine cameras, but lacked the overhead shot that helps set the scene. The camera operators or the stereoscopic director must also set the parallax, depth of field, focus and toe-in of the cameras to create a good stereoscopic effect.
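To make the parallax adjustment a little more concrete, here is a minimal sketch of the kind of arithmetic involved, assuming a parallel rig converged by horizontal image shift; the rig dimensions, convergence distance and function names are my own illustrative assumptions, not figures from this production.

```python
# Hypothetical sketch of the parallax arithmetic a stereoscopic director
# reasons about; the rig numbers below are illustrative, not the broadcast's.

def sensor_disparity_mm(interaxial_mm: float, focal_mm: float,
                        convergence_m: float, subject_m: float) -> float:
    """Approximate disparity on the sensor for a parallel rig converged at
    convergence_m. Zero means the subject sits in the screen plane,
    positive means behind it, negative means it comes out of the screen."""
    c_mm, z_mm = convergence_m * 1000.0, subject_m * 1000.0
    return interaxial_mm * focal_mm * (1.0 / c_mm - 1.0 / z_mm)

def screen_parallax_mm(sensor_disparity: float,
                       screen_width_mm: float, sensor_width_mm: float) -> float:
    """Scale the sensor disparity up to the projection screen."""
    return sensor_disparity * (screen_width_mm / sensor_width_mm)

# A rig converged on the line of scrimmage 30 m away: a player 40 m out sits
# behind the screen plane, while a down marker 15 m away pops out in front.
for subject in (15.0, 30.0, 40.0):
    d = sensor_disparity_mm(interaxial_mm=65.0, focal_mm=50.0,
                            convergence_m=30.0, subject_m=subject)
    print(subject, "m ->", round(screen_parallax_mm(d, 10000.0, 25.0), 1), "mm")
```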
I have to give this aspect a C+ because I saw many things that bothered me. Okay, I'm nitpicking a bit, as many others thought the 3D quality was terrific, so let me offer this as constructive criticism.
First, the naturalness of the 3D effect varied quite a bit. Medium-distance shots are clearly the best: here the perspective seems natural, much as one would see the scene in real life. Medium-range shots of the players on the sideline or of the cheerleaders looked great.
Close-up and long-distance shots are more problematic. Close-up shots created eyestrain when objects came out of the screen; however, most of the action was kept behind the screen plane, giving a through-the-window look, which was the right way to do it.
Sometimes in-focus foreground objects can distract from the focus of the shot, as our eyes (at least mine) are drawn to them. The down-marker pole and the overhead camera on cables (which seemed to float in space) are two examples I remember. In 2D, we see past these objects, but in 3D they are distracting.
Longer shots had several visual artifacts that, for me, created some unnatural scenes. For example, field-level shots would often show a referee in the mid-foreground, with the two teams some yards away. It was very difficult to tell how far the ref was from the teams; the ref and the players all seemed about the same size, creating a confusing depth cue.
In some shots, the depth of field was narrowed so that a mid-distance player was in focus with the background out of focus. This worked well. But when that was reversed, with the foreground out of focus, I found it jarring. In other scenes, where sideline players were framed in the foreground with players on the field behind them, the sideline players looked a bit like cardboard cutouts against a green background.
Other shots featured a foreground or medium-distance object with the stadium seats in the background. Sometimes this produced good depth in the stands, but other shots rendered the background very flat.
I suspect many of these issues have to do with 3D camera settings, and learning how to adjust those settings is all part of this experiment.
There are other aspects of the broadcast to consider, too. For instance, the crew did a great job on production techniques like slow motion, quick replays and the addition of graphics. I would recommend keeping the graphics, like the score, at the edge of the screen and in the screen plane (they were a little in front of it and created some minor eyestrain). The 3D processing was also very good at making cuts that maintained the depth of the scene - a big factor in eliminating eyestrain. There were some glitches that created some very painful scenes, but these were a small percentage of the shots. Overall, I would give this a B+.
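To illustrate the screen-plane point (a sketch under my own assumptions, not the production's actual graphics pipeline): a graphic drawn at identical pixel coordinates in the left and right eye views has zero parallax and sits exactly in the screen plane; shifting it in opposite directions in the two views pushes it off that plane, which is what causes the eyestrain.

```python
import numpy as np

def burn_in_graphic(left: np.ndarray, right: np.ndarray, graphic: np.ndarray,
                    x: int, y: int, parallax_px: int = 0) -> None:
    """Composite a graphic (e.g. a score bug) into a stereo pair in place.
    parallax_px = 0 keeps it in the screen plane; negative values float it
    in front of the screen, positive values push it behind. A hypothetical
    helper, not the broadcast's compositing code."""
    h, w = graphic.shape[:2]
    half = parallax_px // 2
    for eye, shift in ((left, -half), (right, half)):
        eye[y:y + h, x + shift:x + shift + w] = graphic

# A 720p stereo pair with a white placeholder score bug kept at zero parallax.
left = np.zeros((720, 1280, 3), np.uint8)
right = np.zeros((720, 1280, 3), np.uint8)
score_bug = np.full((60, 240, 3), 255, np.uint8)
burn_in_graphic(left, right, score_bug, x=1000, y=20, parallax_px=0)
```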
On the distribution side, the satellite signal was lost twice for 5 minutes, which is OK for a demo but unacceptable for paying audiences.
For the display of the content, a RealD-equipped digital cinema projector was used in the theater, with two LCD TVs in the lobby outside. I preferred the LCD TVs, as the theater image was harder to acquire and created some eyestrain. Both required passive polarized glasses.
The projection display also looked a bit softer. This may have been a result of the larger magnification of the 720p image and some compression artifacts, since the stereo signal from the cameras was combined into a single image stream and sent to the truck for processing. In the NBA games shot previously, dual streams were processed in parallel and transmitted for closed-circuit display. The LCD 3D TVs get an A and the projection system a B.
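As a guess at why the single-stream path costs sharpness, here is a sketch of a frame-compatible side-by-side packing, one common way to fit a stereo pair into a single video stream. I am assuming this kind of packing purely for illustration; the exact format used in the truck was not disclosed.

```python
import numpy as np

def pack_side_by_side(left: np.ndarray, right: np.ndarray) -> np.ndarray:
    """Squeeze a 1280x720 left/right pair into one 1280x720 frame.
    Each eye keeps only half its columns, so horizontal detail is halved --
    a plausible source of softness versus dual full-resolution streams.
    (Naive decimation; a real encoder would low-pass filter first.)"""
    assert left.shape == right.shape
    return np.concatenate([left[:, ::2], right[:, ::2]], axis=1)

def unpack_side_by_side(packed: np.ndarray):
    """Split the packed frame and stretch each half back to full width,
    standing in for the display's scaler. The discarded columns cannot be
    recovered, only interpolated."""
    w = packed.shape[1] // 2
    left_half, right_half = packed[:, :w], packed[:, w:]
    return np.repeat(left_half, 2, axis=1), np.repeat(right_half, 2, axis=1)

# Round-tripping a 720p pair: the frames come back full size but softer.
left = np.random.randint(0, 256, (720, 1280, 3), dtype=np.uint8)
right = np.random.randint(0, 256, (720, 1280, 3), dtype=np.uint8)
recovered_left, recovered_right = unpack_side_by_side(pack_side_by_side(left, right))
```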
I am sure others will have a different opinion of what they saw, but I hope this critique is viewed in the spirit in which it is intended - to improve the process. Last night’s event was like the first workout at spring training camp - you know these pros are just going to get better and better as the season rolls on.
And the good news is that we will get another look in a month, as 3Ality and FOX have decided to do a similar broadcast of the BCS college football matchup in January (it will occur during CES, so look for some live demos at that event). Congratulations to the whole team for making this a successful demo.
By Chris Chinnock, DisplayDaily