Real D CEO Michael V. Lewis is approaching the world of 3D cinema from a different angle: as head of the technology company with the biggest job of convincing film directors, producers and studios, as well as theater owners and exhibitors, that 3D is a worthwhile investment and indeed the way of the future. ComingSoon.net had a chance to sit down with Lewis during ShoWest to find out more about his company's origins, how the Real D technology has been used over the past few years, and how it will continue to provide the main foundation for digital 3D projection as more and more studios get into the act.
ComingSoon.net: Let's start with a little background. How long ago did Real D get started?
Michael V. Lewis: We started the company back in 2003, and we thought there was a big opportunity in this next generation 3D area. We weren't exactly sure how we were going to get it out there to thousands of theaters, but our view was that we had to create a big enough platform so that filmmakers would show up and the economics would make sense. What we did is we went out and looked at companies in other markets and other businesses using very high-end 3D, and our first acquisition was a company called StereoGraphics. They'd been in the high-end 3D visualization market for about thirty years, supplying technology for the military, for NASA, for Fortune 500 companies, to see the way that we see as human beings. We acquired them in 2005 and then we spent some time marrying that technology with a digital cinema projector, and this was the very early days of digital. The hope was that we could somehow bring next generation 3D science together with digital, which we hoped was going to supplant the analog way we'd been seeing films for the last 80 years. We got to a place in June of '05 where we said, "You know what? Wow, that image looks pretty good." It was a very simple upgrade to a digital projector, hardware and software, that took about 15 minutes, and we had the Walt Disney Company come over. Dick Cook came over and said, "Wow, that's really great. Can you get 100 screens up for 'Chicken Little'?" So we went out and did deals with exhibitors, and the rest is history. We've done seven films that have now been shown in Real D, and we're now at 1,200 screens worldwide. We're at about 1,000 screens at 680 locations in the U.S.; some locations have more than one Real D screen. We have 1,800 more committed right now, so it's grown in two years. It's really exploded, and we have 70 exhibitors in 25 countries now. We have a couple in the international markets, and that's taking off now.
The biggest challenge for us right now is the digital cinema rollout, and if we can get that rolling, which I think we will over the next few months... Where there is digital, 90+% of those locations have Real D, so 3D has been a big driver. The reason is that the experience is better, and the exhibitors have been able to charge a lot more because it's a premium experience; they've been able to get $15 a ticket for "Hannah Montana," all the shows.
CS: I wasn't sure if Disney was actually using Real D because they've released most of their movies as "Disney Digital 3D" although they obviously use the same glasses and technology.
Lewis: Yeah, they've kind of co-opted it. They use "Disney Digital" to talk about their production process, but there is confusion in the marketplace, so one of the things we're working on this year is really reinforcing the brand in Real D theaters. That's one of the main things we're focusing on this year: getting the word out to consumers that if you see a movie in Real D, it's going to be a great experience, and we hope you come back.
CS: There's been an amazing amount of growth in the 3D market in the past few months. At last year's ShoWest, there was some talk about 3D, but it's become this year's buzzword due to the number of box office hits.
Lewis: What has happened literally in the last three to six months is there's been a tremendous number of people using the words "game changer" or "biggest thing since color and sound," and it's been driven, at the end of the day, by filmmakers. We spend a lot of time with filmmakers. We bring them in and show it, and they get excited. They go, "Wow, this is a new way to tell my stories, this is cool. I want to play." So you've got the best filmmakers in the world going, "I'm doing this!" and at the end of the day, all the other stuff doesn't matter. It's about them wanting to tell their story in a new way.
CS: Obviously, there are a lot of animated movies being made using the technology, but computer-generated animation is inherently 3D anyway, so what are some of the biggest challenges of doing live action movies with the process?
Lewis: What you're doing with 3D, very simply, is trying to replicate the way that we see. In animation, you're already in stereo; you're creating another eye, if you will. Left eye, right eye, so you move the camera over. In live action, you're just using two cameras, a left-eye and a right-eye camera, and much as Real D has benefited on the delivery side, where digital allowed us to get it right and perfect every time, the same thing has happened with camera technology. Where you used to need big, 350-pound IMAX cameras to shoot 3D, now it's two digital cameras side by side. Cameron is using a 19-pound camera. It's just gotten a heck of a lot easier, so we have this perfect storm of digital helping us on delivery and digital helping the production; all these things have come together. You saw the result today with what Jeffrey Katzenberg showed. The filmmakers are getting smart really fast, and the tools that allow them to do this are getting a lot easier to use. It used to be hard to edit; all these problems that plagued 3D are being solved very quickly.
CS: How much more does it add in terms of cost to the production side to do something 3D?
Lewis: Well, it depends what it is. For animation, if you talk to Katzenberg, he'll say that DreamWorks is spending about 15 to 20 million dollars extra per film. You've got more rendering time, because you can't get away with the cheats you get away with in 2D; the eye picks them up. It's just more time-consuming. Live action people say anywhere from 10 to 25% of the below-the-line production budget, but again, if it's visual effects intensive, it's going to be more; if it's not so much, it's going to be less.
CS: Do you see a point where all movies are going to be done in 3D because moviegoers just get more used to that?
Lewis: We have all the big blockbusters over the next couple of years. Thirty 3D films, I think, have been announced or put into production over the next three years, including some of the big movies, but I think you will see an expansion to a much broader audience. Younger filmmakers will start using it. A few weeks ago I had Baz Luhrmann in our screening room and he was going, "Wow, this is fantastic." We spent hours going through it, and he said, "Well, I have to rethink this because my actors, they look so much smaller..." He was working out in his brain how he was going to recompose his shots. He said that he wished he could have done "Australia" that way, because it would have been fantastic. I think the answer to your question is "yes," but I don't know in what period of time. Is it five years? Ten years? It's much like color: when you shoot color, it's very hard to go back to black and white, and we think this is the evolution of how we see in visual media, driven by cinema, so I think the answer is "yes."
CS: You're obviously working with Disney and with DreamWorks...
Lewis: We're working with all of them.
CS: Is it a matter of getting the filmmakers on board first or does it have to come from the studios that are willing to invest in using the technology?
Lewis: You know, it's both. It comes from all directions. In the case of Disney, they said "okay," so what we do is we provide the studios... we have a Pro line; we provide a lot of visualization technology, eyewear, different things, so they can actually see what they're producing while they're shooting. We provide that. Most of the screening rooms in Hollywood are now equipped with Real D, so they can now see what the heck they're doing. We spend a lot of time doing outreach with directors, producers and production personnel, and we do a lot of 3D 101 classes.
CS: One of the things that's been discussed a lot is the cost of the projectors, and it's not just the amount for digital projectors but also the add-ons to be able to do Real D.
Lewis: Yes, if you have a digital projector, then you're a candidate for a Real D upgrade, and then we license our technology, which includes all the maintenance and upgrades. We're constantly improving the system. Today you saw an example of something that's never been done before: in the last two years, we've only been able to get the image to 45 feet in terms of screen width. Today, you saw 60 feet, and that was on a single projector, which has never been done before. We're light-challenged, but we came up with a technology that allows us to recycle light and show a much better image.
CS: I know Zemeckis when he made "Polar Express," he did it in IMAX 3D, so what did it take to get him to switch over to your system for "Beowulf"?
Lewis: He came over and he saw it and thought it looked great, and he likes the way digital looks. IMAX is film-based, we're digital, and so he said, "Well, let's do our next film," which was "Monster House," and that's how we got into the Bob Zemeckis-Steve Starkey-Jack Rapke business. The next one they're doing is "A Christmas Carol" in 2009, using the same performance capture as "Beowulf," in 3D.
CS: As far as getting the theaters to be compatible to do 3D, do you think there'll be a point where every theater is going to be ready to show movies this way?
Lewis: I think the entire industry obviously is going to go digital. How long that takes is probably a five to seven year process, and then any of those digital projectors can be upgraded. Right now, in most of our deals we're doing a third of a chain in Real D. We just did a deal last year with Odeon, which is the biggest international chain. They have 1,500 theaters, and we did a 500-screen deal, so it's about a third of the circuit.
CS: There are other 3D technologies out there, so do you think there ever might be a format war issue like with HD DVD and Blu-Ray Disc?
Lewis: You know, I don't think so, because if DreamWorks makes a movie, it'll show on other formats, so it's not like DVD. Right now, we have 97% of the market. We're happy... well, actually, we're not happy with that. We want to get the other 3%.
CS: Now was the U2 movie shot using one of those other technologies?
Lewis: No, they shot it digitally and they showed it in IMAX. We had "Hannah Montana" which soaked up all our screens, so "U2 3D" had to go IMAX and it's now showing in Real D.
CS: What future content do you foresee being shown in Real D? Obviously, there are a lot of big national acts who could have their shows filmed that way, but when U2, a huge act that sells out stadiums, can't get as many people into theaters as Hannah Montana, it makes you wonder what the market can stand.
Lewis: I think a lot of it is marketing and distribution. For one, you had the Walt Disney Company behind it, which makes a big difference, but I think the opportunity is to program these theaters much like you program a TV station, and with digital, you can do that. You're going to see that over the next couple of years. We actually did a live event last year with the NBA: we broadcast the All-Star Game directly to the Mandalay Bay in Real D, and we hope to do a lot more of that, programming these theaters with something new every time.
CS: One thing that a lot of people have mentioned this week as one of the benefits of 3D is that it helps fight piracy.
Lewis: Yeah, you'll see a blurry image without the glasses.
CS: Also, it's given people a new reason to go see movies in theaters. That said, do you think Real D will ever work on bringing that experience into home theaters for those who may want that experience?
Lewis: Well, we're focusing on cinema right now. One of the big advantages and why the theater operators love it so much is the differentiator, it's something unique to the cinema, it brings the magic back. Our view is that all visual display devices will go this way, whether it's your cell phone, your iPod, but right now, cinema has a minimum of a five-year window before there are other technologies in the home and so forth. We love cinema so we're going to make sure that cinema is the top of the food chain.
CS: Another factor at least until "Beowulf" came out was that 3D was a gimmick only for kids' movies, so are you seeing adults getting more comfortable with the glasses?
Lewis: Yeah, you know, it's interesting. It's generational, because when we first got going, if you're over 30 and you grew up with red and green glasses, you go, "Oh, I have to wear the glasses," but you put them on, you're comfortable and you forget about them. We tried to make them high-end, so they don't look cheesy and they're comfortable. But if you're younger, people say, "Oh, I GET to wear the glasses," because kids are used to video games and all that. So it is generational, and hopefully, as more people experience Real D and see how great it is, they're going to be okay with it. Eventually, it will go to no glasses. We're working on that.
"There is a great deal of misunderstanding going on with Parallel vs. Converged topic. It is important to understand that there are actually three different categories of camera configuration (not just two):
(1) Parallel cameras WITHOUT image offset - In this model the lenses/cameras are parallel and the optical axes of the two cameras overlap at infinity. This model is usually what most people think of when the "Parallel" camera model is mentioned. Without any post shifting of the images to correct the zero parallax distance (ZPD), objects at infinity will be cast at the surface of the display and all other images will be cast in front of the display. Nothing appears behind the screen surface (no positive screen parallax).
(2) Parallel cameras WITH image offset - In this model the lenses/cameras are again parallel, however, the images are shifted either in post, or in the camera, by shifting the two imaging sensors (e.g. CCD) behind the lenses. This shifting of the images has the effect of changing the zero parallax distance (ZPD) (sometimes called the convergence distance). In this model, the ZPD is usually set mid way through the scene, meaning that some objects will appear behind the screen and some objects will appear in front of the screen.
(3) Toed-in cameras - In this model the cameras (and lenses) are rotated inwards so that their optical axes intersect at a point usually mid-way through the scene, so that some objects will appear behind the screen and some objects will appear in front of the screen. This is usually what most people think of when the "converged" camera model is mentioned. As has already been pointed out the "toed-in" camera model results in keystone distortion, but also "depth plane curvature" - flat planes can appear bowed in the centre toward the camera.
Note that I have not used the term "converged" to describe the last case. That is because it could be argued that both models (2) and (3) are converged. One definition of "converge" is "to come together". In the case of toed-in cameras, the optical axes of the two cameras obviously come together (usually to a point roughly midway in the scene). In the second camera model (parallel with image offset), if we consider the case where the image offset is generated by shifting the imaging sensor behind the lens, the optical axes of the two cameras are angled inwards (but not by rotation of the cameras/lenses). Where the optical axes intersect will be the zero parallax distance (assuming no further image offset in post). Since the two optical axes come together, the second model could also be called converged.
I am sure some will disagree with this interpretation; however, I note that Lenny Lipton in a recent blog entry makes the same point as I am making. Lenny's point is: don't use the term "converged cameras," since it is so easily confused. My additional point would be: don't just use the term "parallel cameras," since it is also so easily confused. If you do see the terms "parallel cameras" or "converged cameras" used, dig deeper to understand whether they mean camera model (1), (2) or (3).
It is important to point out that no-one in their right mind would use camera model (1) (parallel WITHOUT image offset). As mentioned above, it would result in all images on a stereoscopic display being cast in front of the display, with objects at infinity shown on the surface of the display. So when people talk about using the "parallel camera model," they are probably really talking about model (2), "parallel cameras WITH image offset." In many ways it is equivalent to model (3), "toed-in cameras," but without the keystone distortion and depth-plane curvature. Model (2) is harder to achieve with real cameras than model (3), which is why people still tend to use model (3).
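The geometry behind models (1) and (2) above can be sketched in a few lines. This is a simplified pinhole-camera model with illustrative numbers (65 mm interaxial, 35 mm lens, 300x sensor-to-screen magnification) chosen for the example, not figures from the text; model (3) adds keystone and depth-plane-curvature terms and is not reproduced here.

```python
# Sketch of on-screen parallax under camera models (1) and (2).
# All symbols and numbers are illustrative assumptions.

def screen_parallax(t, f, Z, offset, mag):
    """On-screen horizontal parallax (mm) of a point at distance Z (mm).

    t      -- interaxial separation of the two cameras (mm)
    f      -- lens focal length (mm)
    offset -- total horizontal image/sensor shift (mm); 0 for model (1)
    mag    -- magnification from sensor to cinema screen

    Negative = crossed parallax (in front of the screen),
    positive = uncrossed parallax (behind the screen).
    """
    return mag * (offset - f * t / Z)

# Model (1): parallel, no offset -- nothing ever goes behind the screen.
p_near = screen_parallax(65, 35, 3_000, 0.0, 300)   # object at 3 m
p_far  = screen_parallax(65, 35, 1e9,  0.0, 300)    # "infinity"
# p_near is negative (in front), p_far tends to 0 (at the screen plane)

# Model (2): same cameras, sensors shifted to put the ZPD at 5 m.
zpd = 5_000
offset = 35 * 65 / zpd      # the shift that zeroes parallax at the ZPD
p_front  = screen_parallax(65, 35, 3_000,  offset, 300)  # in front
p_at_zpd = screen_parallax(65, 35, zpd,    offset, 300)  # on the screen
p_behind = screen_parallax(65, 35, 10_000, offset, 300)  # behind
```

With the offset term at zero the parallax can never be positive, which is exactly why model (1) casts everything at or in front of the display.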
A few other important points:
- Convergence in itself does not create divergent 3D images (divergent infinity). It is the poor selection of camera separation (interaxial), i.e. too wide, combined with close convergence and large screen sizes, that produces divergent 3D images.
- "Ghosts of the Abyss" by James Cameron provides some very good examples of what you should not do with stereoscopic cameras - e.g. pulling convergence. The biggest problem with the cameras used on this shoot was that the minimum interaxial separation of the stereoscopic cameras used was not suitable for the extreme closeups (and pulling convergence really close) that they did numerous times in this movie. The cameras were set side-by-side meaning that the minimum interaxial separation was quite large. I was pleased to read recently that Pace/Cameron now also have beam-splitter rigs - and therefore allowing more suitable camera separation, particularly for closeups.
- 3ality Digital used many 3D camera rigs to shoot "U2 3D." Some were of their own design, but they didn't have enough of them, or enough time to make more, so they rented others in (including from Hineslab, and others). My understanding is that most of the stereoscopic camera rigs were of the beam-splitter design (which allows a very narrow camera interaxial if required) rather than the side-by-side configuration.
- There are major differences in the ways that stereoscopic images can be captured in CG vs. CCD vs. film. Camera model (2) is very easy to achieve in CG, whereas it is difficult to achieve with CCDs which is why camera model (3) is often used. With film, there is no pixel accurate reference, so people are probably using camera model (2) while they think they are using camera model (1).
- If "keystone distortion" and "depth plane curvature" are still unclear to you, check out figures 8 and 7 respectively here. Note that this 1993 paper usually means "parallel cameras WITH image offset" when it refers to "parallel cameras" - which in turn could be contributing to the confusion."
By Andrew Woods
Monday, March 31, 2008
"Buzz Lightyear, Bono and Jonny Wilkinson are unlikely bedfellows, but together they are helping to push back the boundaries of TV and film production. Meanwhile, back on the sofa, viewers may soon need to buy yet another new telly.
Forty years after the inaugural colour TV broadcast was shown at the Riverside Studios, in west London, the same venue hosted another first this month: a live 3D test transmission beamed in by satellite. The event was England's RBS Six Nations rugby match against Scotland, shown to a select group of 200 people from across the sport, music and media industries.
The audience sat wearing special 3D glasses as BBC Sport, working in partnership with the 3D Firm, a consortium of specialist companies, sent pictures from Murrayfield to London, overlaid with a running commentary taken from Radio 5 Live.
With only three cameras in use, the production had a minimalist feel, and later one camera was withdrawn due to rain. As the audience emerged blinking from the auditorium, several likened the experience to that of being at a live game, and it was only the shocking quality of the rugby that led many people to stay in the bar for the second half.
Most of those present saw enough to suggest that live 3D was part of television's future, though how is unclear. There is speculation that major events such as the Olympics and World Cup will now be screened live in 3D on big screens in major cities.
But Aashish Chandarana, former Head of Innovation at BBC Sport, says the BBC has no plans for other live 3D transmissions. "This was a one-off. We don't have a strategy. It's important that we are always looking to see how we can improve things for audiences and this was about understanding the broadcast end of the chain. But what the market will take from this remains to be seen".
Current demand for 3D is coming mainly from the commercial sector, where a number of business models are emerging. However, the more content becomes available, the greater the push into the home market will be.
"For the time being screening will be limited to cinemas or bespoke locations," said Chris Dyer, operations director of Can Communicate, one of the firms in the 3D Consortium. He thinks that premium events, where the demand for tickets outstrips supply, lend themselves better to successful live 3D transmission. Some sporting big guns were present at the screening. Representatives from London 2012 and Fifa, football's world governing body, were among those donning the dark glasses. Francis Tellier, the man in charge of broadcasting the Fifa World Cup was hedging his bets: "This is not for today. But things move so fast, who knows what time-frame we are working toward?"
Sportswear company Reebok are using 3D as a promotional tool, creating a short film to showcase their sponsorship of boxer Amir Khan.
"We were interested in a way of getting closer to the action," says Steve Martin, chief executive of M&C Saatchi Sport and Entertainment. "It's expensive to create (but) Reebok were interested because they would get first-mover advantage – it's always about being first. Once it becomes commonplace, brands will move on to something else".
Likewise, cinema owners view 3D as a way of putting some distance between themselves and the burgeoning home-cinema market. According to Screen Digest, there were 47 digital 3D screens in the UK by the end of 2007, with forecasts suggesting this number will rise to 429 by 2011. Of the 1,298 digital 3D screens worldwide at the end of 2007, 75 per cent were in America. Many Hollywood studios have either recently released 3D movies, or have them in production. Disney-Pixar is re-releasing the 1995 hit animation film Toy Story in 3D, ahead of the third movie in the series, also in 3D, in 2010. Rival animation giant, DreamWorks, has committed to producing all its movies in 3D from 2009. U2 3D, a film of the Irish supergroup's live act, is currently in cinemas and Beowulf, starring Ray Winstone, got a pre-Christmas 3D release.
But it's the domestic television market that will determine whether 3D becomes the next big thing, or just another passing technology. And this may be a harder sell. There has been confusion and irritation over the roll-out of new widescreen high-definition TV sets, with many customers complaining that they bought an LCD set in anticipation of watching HD, only to be later advised that what they really needed was yet another expensive upgrade.
To get the full 3D effect, viewers will need to buy a stereoscopic television. Philips has developed a prototype 132-inch 3D TV that offers an "out of screen" experience and does not require viewers to wear glasses. The first sets will come with a whopping £10,000 price tag.
"Now we need enough viewers to make it worthwhile," says Chris Dyer of the 3D consortium. Domestic sales will get a boost from the computer-games industry, which is producing compatible titles. "Once gamers have their screens they won't just want to play their games, they will want 3D content," says Dyer."
By Richard Gillis, The Independent
Monday, March 31, 2008
IRIDAS now offers support for the DALSA RAW 4K format. This advance allows filmmakers using SpeedGrade to take full advantage of the pristine image quality of the DALSA Origin camera with instant real-time review, conform, re-framing, and grading of native 4K RAW footage. Unrendered RAW files offer more image data at one-third the size of an equivalent DPX file, greatly reducing throughput and storage requirements.
IRIDAS is the only developer offering live de-Bayering of all available RAW formats. Currently the IRIDAS SpeedGrade and FrameCycler applications support ARRI D20 RAW, CineForm RAW (used by Silicon Imaging and others), Phantom RAW, and WEISSCAM RAW, as well as DALSA 4K RAW.
Source: Digital Cinematography
Saturday, March 29, 2008
HDTV and MXF file formats have been revolutionizing the broadcast and digital cinema industries, dramatically improving workflow and production facilities in video/IT environments. The market has now reached maturity. OpenCube Technologies is meeting these latest market requirements by providing its customers with professional solutions to reduce their lead time. At the upcoming NAB, the company will present a number of major new product releases.
XFConverter v1.1: Intuitive software that repurposes all types of file formats (MXF, QuickTime, GXF, AVI, etc.), with new, enhanced features for seamless connections to Avid and Final Cut Pro editing solutions.
OpenCube SD v2.1 and OpenCube HD v2.1: Two robust DDRs that streamline workflows, ingest MXF files, provide print-to-tape capabilities and create Digital Cinema Packages.
P2Soft v2.0: The ideal software to ingest, shot-log and manage all types of MXF P2 media.
MXFTk v2.1: The new OpenCube MXF toolkit. The OEM SDK offers MXF file VBI support, Avid MXF file management, and improved HD essence handling capabilities, in particular support for P2 MXF AVC-I and XDCAM MXF MPEG-2 HD.
XFReader v2.1: The essence-agnostic MXF and GXF viewer for Windows, which permits the browsing of any media files on graphics and SDI screens, has now been adapted to work with the HD and SD Decklink boards, providing users with an efficient, cost-effective solution for transforming their PCs into MXF HD SDI players.
Saturday, March 29, 2008
Labels: IT Broadcast
Omneon is launching ProCast on the market following its acquisition of Castify Networks late last year. ProCast is a high performance file transport engine employing what the company claims is unique acceleration technology which allows for high-speed file transfers over a wide area network, writes Adrian Pennington.
With ProCast, users can move large media files over great distances "as fast and simply as if it were a local transfer." The core technology behind this system was obtained by Omneon in the acquisition of Castify in December 2007. File transfer speeds achieved with ProCast are orders of magnitude greater than FTP transfers and are not affected by distance. For example, a one hour DV25 file sent from Los Angeles to New York on a 450Mbps connection would take 30 hours via FTP, but takes only three minutes with ProCast. The same file sent from Beijing to New York would take 69 hours via FTP, but still takes just three minutes with ProCast.
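The distance sensitivity described here is characteristic of single-stream TCP, whose throughput is capped at roughly one window per round trip; accelerated transports keep the pipe full regardless of round-trip time. A back-of-the-envelope sketch, where the window and RTT values are my own illustrative assumptions rather than Omneon's test conditions:

```python
# Why FTP slows down with distance while an accelerated transport does
# not: single-stream TCP throughput is bounded by window / RTT.
# Window and RTT figures below are illustrative assumptions.

def tcp_bound_bps(window_bytes, rtt_s):
    """Classic single-stream TCP ceiling: one window per round trip."""
    return window_bytes * 8 / rtt_s

def transfer_time_s(file_bits, link_bps, window_bytes, rtt_s):
    """Seconds to move a file, throttled by the slower of link and TCP."""
    return file_bits / min(link_bps, tcp_bound_bps(window_bytes, rtt_s))

ONE_HOUR_DV25 = 25e6 * 3600   # ~90 Gbit of video (audio ignored)
LINK = 450e6                  # 450 Mbit/s connection

# FTP-style transfer with a 64 KiB window: RTT, not the link, sets
# the speed, so the Beijing path is far slower than the LA path.
t_la_ny   = transfer_time_s(ONE_HOUR_DV25, LINK, 65536, 0.070)
t_beijing = transfer_time_s(ONE_HOUR_DV25, LINK, 65536, 0.250)

# A transport that keeps the pipe full runs at line rate regardless of
# distance: 90 Gbit / 450 Mbit/s = 200 s, i.e. about three minutes.
t_full = ONE_HOUR_DV25 / LINK
```

The exact FTP hours quoted in the release presumably reflect different window sizes, loss rates and protocol overheads, but the shape is the same: round-trip time dominates the FTP figures, while the distance-independent three-minute figure falls straight out of file size over link rate.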
ProCast is fully integrated with all Omneon server and storage products (MediaGrid, Spectrum, and MediaDeck). It allows for point-to-point and point-to-multipoint transfers of content between systems. Typical media applications for ProCast include transport of contributions from regional offices of large broadcasters, transfers between affiliates, transfers from post productions houses to clients and broadcasters, etc.
Source: TVB Europe
The creation of the 3D@Home Consortium has been driven by a desire among many of the leading 3D organizations to ensure the best possible three-dimensional viewing experience for the billions of consumers in today's home entertainment marketplace.
To achieve this, several short-term goals have been identified:
- facilitate the development of industry standards and their dissemination
- create and publish useful technical roadmaps
- develop educational materials for training, consumer and retail channels
There are many diverse organizations working in the burgeoning 3D supply chain, including 3D content developers, technology companies creating software to manipulate digital content, companies with the hardware platforms on which it is shown, standards bodies, and educational institutions.
Representatives from every one of these segments have had input into the structure and scope of the 3D@Home Consortium. These representatives strongly believe that with this effort the industry can develop at a healthier and accelerated rate, while enriching the experience for consumers.
The Consortium is Open for Business. Join now and become a part of this very exciting industry-wide effort.
An interesting article by Mike Seymour.
Mike Seymour chats with Tim Baier, a vfx supervisor and 3D researcher who now specializes in 3D stereo productions, about the history, technology and trends in the field.
Download this podcast
"Maximum Throughput, Canadian developers of software-based solutions for networked storage infrastructure, will unveil the new MAXedit Web Edition, a subscription-based online editing service, at NAB.
It's the company's first venture into networked online editing. Maximum Throughput is best known for Sledgehammer, a storage system for HD editing used by customers like CBS, FOX News, NBC, ABC and the BBC.
With MAXedit Web Edition, videographers have a simple, streamlined editing system that's readily accessible whenever and wherever they need to work. Since it's a subscription-based online service, customers can upload their HD or SD content (including HDV, DVCPRO, MPEG-2, and H.264 formats) to the online server and edit and share it with other editors via a web browser interface.
MAXedit's adaptive streaming technology assures editors that they're always working with frame-accurate, real-time creative control. Because the editing is executed on the online server, editors are free to use inexpensive, standard PCs or laptops in their studios or in the field.
Also at NAB, Maximum Throughput will introduce MAXedit Server Edition, a cost-effective, server-based workgroup editing solution for editing uncompressed and compressed multi-resolution content. It allows multiple editors to edit content from a single server in their facility using standard PCs, MAXedit Server software, and a web browser without taxing their LAN's bandwidth."
By Michael Grotticelli, HD Studio
Wednesday, March 26, 2008
SRG SSR idée suisse, the Swiss public broadcaster, has selected groundbreaking compression technology from Media Links to facilitate the distribution of lightly compressed HD signals over its existing SD infrastructure.
SRG SSR idée suisse holds exclusive television rights to UEFA EURO 2008 and will be broadcasting all 31 matches played in Switzerland and Austria on its Schweizer Fernsehen (SF), Télévision Suisse Romande (TSR) and Televisione svizzera di lingua italiana (TSI) television stations. Football games will be shown in high definition format for the first time on the HD suisse channel.
The HD270 Coax allows the compression and transport of HD-SDI signals over 270 Mbit/s circuits with outstanding video quality at incredibly low latency. By utilising field-by-field encoding, the entire codec process adds less than 10 milliseconds of end-to-end delay. The encoder design generates a standard-compliant transport stream operating at 270 Mbit/s, enabling broadcasters to utilise existing SD video transport infrastructures for HD signals. Advanced video compression produces extremely high video quality. The dual-encoder can be set to three modes of quality versus delay, offering tremendous flexibility to suit various applications.
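As a rough sanity check on that latency claim, the arithmetic is easy to sketch. This is an illustrative back-of-envelope calculation: the sub-10 ms figure comes from the article, while the 1080i50 field rate is a standard European broadcast value that the article does not state.

```python
# Back-of-envelope check of the HD270's latency claim.
# Illustrative only: the <10 ms figure is from the article; the
# field rate is the standard European 1080i50 value (an assumption).

def field_period_ms(fields_per_second: float) -> float:
    """Duration of one interlaced field in milliseconds."""
    return 1000.0 / fields_per_second

# European 1080i50 video carries 50 fields per second.
period_i50 = field_period_ms(50)

# A frame-based codec must buffer a full frame (two fields) before
# encoding, i.e. >= 40 ms at 1080i50. Encoding field by field lets the
# codec start after a single field or less, which is how an end-to-end
# delay under 10 ms becomes plausible.
claimed_delay_ms = 10.0
print(period_i50)                     # 20.0 ms per field
print(claimed_delay_ms < period_i50)  # True: less than one field period
```

The point of the comparison: the claimed delay is under half a field period, which is only achievable if the codec never waits for a complete frame.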
Fast Forward Video (FFV) has launched the Elite HD, the first camera-mounted digital video recorder (DVR) and player to harness the JPEG 2000 (J2K) compression codec for recording HD-SDI video signals.
The Elite HD gives broadcasters a powerful new option for recording high-quality video from HD-SDI cameras while reducing the costs of storage media.
Until now, broadcasters had to rely on the internal recording capabilities of their camcorders to record and play back HD-SDI video; however, in nearly every case, these recorders cannot match the video quality of the camera signal itself. In addition, onboard camera recorders require proprietary and expensive storage media, which can cost upwards of $500 for only 32 GB. In a typical workflow, these cards require another device to read the video and transfer it for editing.
Designed to mount on the back or base of a camcorder, the Elite HD accepts an incoming HD-SDI video signal with up to eight channels of embedded audio and uses J2K to record at data rates up to 100 Mbps with virtually no loss in signal quality. Video is stored to an off-the-shelf, hot-swappable 2.5-inch SATA drive, which provides up to 10 times more storage at a greatly reduced cost. In addition, the Elite HD enables streamlined, file-based workflows with its modular design, which can be easily detached from the camcorder and connected directly to a nonlinear editing system via USB cable.
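The storage arithmetic behind that claim is easy to sketch. The 100 Mbps rate is the article's stated maximum; the 320 GB drive capacity below is a hypothetical example of a period 2.5-inch SATA drive, not a stated spec.

```python
# Recording-time arithmetic for the Elite HD's storage claim.
# The 100 Mbps J2K rate is from the article; the 320 GB drive
# capacity is a hypothetical example, not a quoted spec.

def recording_hours(capacity_gb: float, bitrate_mbps: float) -> float:
    """Hours of footage that fit on a drive at a given video bitrate.
    Uses decimal units (1 GB = 1000 MB, 1 byte = 8 bits); audio and
    filesystem overhead are ignored for simplicity."""
    mbytes_per_second = bitrate_mbps / 8.0         # Mbit/s -> MByte/s
    seconds = capacity_gb * 1000.0 / mbytes_per_second
    return seconds / 3600.0

# A 320 GB SATA drive vs. a 32 GB proprietary card, both at 100 Mbps.
print(round(recording_hours(320, 100), 1))  # ~7.1 hours
print(round(recording_hours(32, 100), 1))   # ~0.7 hours
```

At the same bitrate, the commodity drive holds roughly ten times the footage of the 32 GB card the article mentions, which is where the "up to 10 times more storage" figure comes from.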
The Elite HD supports all HD-SDI camcorders and other video sources with compatible rates, including the Iconix HR-1; Canon XL H1 and G1; Sony F900, XDCAM HD, and XDCAM EX; JVC GY-HD250; Panasonic GP-US932; and Toshiba IK-HD1. The Elite HD also works with any HD-SDI switcher with a compatible output format, making it ideal for recording live events.
"In the second programme in this series on stereoscopic 3D we turn our attention to Continental Europe, and the French market in particular. Bill Scanlon's guest is Philippe Gerard, founder and CTO of 3D production company 3DLized.
The conversation covers a lot of ground, including the pace of growth in Europe, 3D origination and 2D to 3D conversion, the role of the stereographer in 3D post production and the reaction of the French government to this emerging sector of the cinema industry."
Download this podcast
Source: The Broader Issue
Wednesday, March 26, 2008
"Camera manufacturer Iconix Video is planning to expand both organically and through acquisition, its new CEO Bruce Long said. His growth plan includes development of technology for stereoscopic 3-D production and postproduction.
"We are moving from camera company to integrated service provider," said Long, the former president and COO of National Lampoon, indicating that an acquisition was in Iconix's near future.
Iconix -- maker of small "point of view" high-definition cameras -- plans to expand its product line with a stereoscopic 3-D production and post pipeline, which will be exhibited at next month's National Association of Broadcasters Show.
At NAB, the company will present a new Iconix camera that will capture 2K, a higher resolution than the current model. Parts of the developing 3-D toolset -- notably the 3-D camera rig -- already have been tested or shown to select audiences. Iconix's developing 3-D arsenal also includes an on-set digital recording device, as well as post tools including a playback/conform system."
By Carolyn Giardina, The Hollywood Reporter
Monday, March 24, 2008
FSN Southwest and the Dallas Mavericks will provide a look into the future of sports television on March 25 when they team with PACE, the leader in digital 3D productions, to produce the first-ever NBA regular-season game broadcast live in 3D HD, captured with PACE Fusion 3D. It will be only the third live sporting event ever presented in the innovative format.
The March 25 game against the Los Angeles Clippers from the American Airlines Center will be beamed across town via satellite into Dallas Mavericks owner Mark Cuban's Magnolia Theatre in Dallas' West Village, where an invitation-only audience will watch through special 3D glasses, with Sony's SXRD 3D Projection System on an 18x42-foot screen making it feel as if they're sitting courtside. In addition to VIP guests, the audience will include over 100 lucky Mavericks fans, who can win tickets to the event by entering an online sweepstakes at mavs.com.
FSN Southwest will utilize the proprietary PACE/Cameron Fusion Sports System to capture the action on the court and deliver a unique depth-of-field perspective to the Magnolia Theatre audience. Each of the four 3D systems in use is designed with two high-definition cameras that capture the left-eye and right-eye imagery separately to create a single three-dimensional image. The result is a "wow" visual experience that makes the action seem so close and spectacular that most viewers will probably forget they're sitting miles away in a movie theatre.
The 3D HD production will be separate from FSN Southwest's game telecast and will use the Mavericks' radio call with announcers Chuck Cooperstein and Bob Ortegel describing the action. During timeouts and television commercial breaks, the 3D HD systems will cover the on-court festivities, allowing fans watching in the theatre to experience the in-arena atmosphere.
"We're excited to be on the ground floor of 3D HD with the Dallas Mavericks," said FSN Southwest Senior Vice President/General Manager Jon Heidtke. "Mark Cuban has always been one step ahead of everyone in technology. He was a pioneer of the internet revolution with the creation of broadcast.com, and he led the way in high-definition television with the launch of HD Net. Now he's ahead of the game again as the first team owner to produce a regular-season game in 3D HD. We're happy to be partners with him as we take a sneak peek into the future of sports television."
This will mark only the third live sporting event ever presented in 3D HD, all of them NBA productions using PACE Fusion 3D. The 2007 NBA All-Star game in Las Vegas was the ground-breaking event with an invitation-only viewing party at the Mandalay Bay Hotel. Game 2 of the 2007 San Antonio Spurs-Cleveland Cavaliers NBA Finals from San Antonio was shown in 3D HD to the public at Cleveland's Quicken Loans Arena.
"We had a great response from the audience during the world's first ever live 3D sports broadcast with the NBA games last year and expect a similar response in Dallas," said PACE CEO Vince Pace. "We can see this type of venue growing rapidly as fans continue to become acquainted with the Fusion 3D experience and see for themselves how our systems blur the lines between what is real and what is Fusion 3D."
Sony's CineAlta™ 4K digital cinema projectors will be used for this event, with two SRX-R110 4K projectors in a double-stacked configuration.
"4K projection technology is the perfect complement to 3D cinema, where the goal is to provide a feeling of 'being there,'" said Andrew Stucker, director of Sony Electronics' digital cinema systems group. "4K resolution, or four times the resolution of HDTV, can enhance the 3D experience several-fold, taking movie-going to a new level and transforming the audience from viewers into participants."
Dallas Mavericks Director of Broadcasting Dave Evans will produce the 3D HD telecast, while FSN Southwest Senior Executive Producer Mike Anastassiou will serve as director.
Source: Wallstreet Online
Friday, March 21, 2008
The technology would consist of a projection screen having a predetermined angularly-responsive reflective surface function that would produce depth perception for the viewer even though the image is produced by a flat device, according to a filing with the United States Patent and Trademark Office.
"Modern three-dimensional (3D) display technologies are increasingly popular and practical not only in computer graphics, but in other diverse environments and technologies as well," Apple said in the 25-page filing. "Growing examples include medical diagnostics, flight simulation, air traffic control, battlefield simulation, weather diagnostics, entertainment, advertising, education, animation, virtual reality, robotics, biomechanical studies, scientific visualization, and so forth."
While common forms of such displays require shuttered or passively polarized eyewear, those approaches have not met with widespread acceptance because observers generally do not like to wear equipment over their eyes, the company said. Such approaches are also said to be impractical, and essentially unworkable, for projecting a 3D image to one or more casual passersby, to a group of collaborators, or to an entire audience such as when individuated projections are desired.
As a result, Apple proposes a three-dimensional display system having a projection screen with a predetermined angularly-responsive reflective surface function. Three-dimensional images would be respectively modulated in coordination with the predetermined angularly-responsive reflective surface function to define a programmable mirror with a programmable deflection angle.
This form of technology would cater to the continuing need for practical autostereoscopic 3D displays that can also accommodate multiple viewers independently and simultaneously, the company said. Unlike 3D glasses or goggles, it would provide simultaneous viewing in which each viewer could be presented with a uniquely customized autostereoscopic 3D image, entirely different from that being viewed at the same time by any of the other viewers present, all within the same viewing environment, and all with complete freedom of movement.
According to the filing, this form of display could include a 3D/stereoscopic rendering engine that renders 3D images and may be implemented in firmware, software, or hardware. The 3D/stereoscopic rendering engine could also be part of a graphics card, code running on a graphics chip's graphics processor unit, a dedicated application specific integrated circuit, specific code running on the host CPU, and so forth.
"The 3D images that are rendered by the 3D/stereoscopic rendering engine are sent to a 3D/stereoscopic display through a suitable interconnect, such as an interconnect based upon the digital video interface (DVI) standard," Apple said. "The interconnect may be either wireless (e.g., using an 802.11x Wi-Fi standard, ultra wideband (UWB), or other suitable protocol), or wired (e.g., transmitted either in analog form, or digitally such as by transition minimized differential signaling (TMDS) or low voltage differential signaling (LVDS))."
A display interface and image splitter inside the 3D/stereoscopic display would divide the 3D images from the 3D/stereoscopic rendering engine into two 3D sub-images, namely a left sub-image and a right sub-image. The left and right sub-images would be modulated (including being turned on and off) in respective image modulators to enable and control optical projection by a projector of the left and right sub-images into the observer's left and right eyes, respectively.
"The observer's brain then combines the two projected optical sub-images into a 3D image to provide a 3D viewing experience for the observer," the filing explains. "The deflection into the observer's respective left and right eyes is accomplished using a projection screen. The projection screen, in combination with image data properly modulated [...] forms a mirror device that is a programmable mirror with a programmable deflection angle."
Broadly speaking, Apple said, this combination constitutes the projection screen as a programmable mirror that is a spatial filter, because the combination operates to cause light to reflect from the projection screen to the observer's particular left and right eyes as a function of the spatial locations of those respective eyes, and otherwise does not reflect light -- as if the light were filtered out.
A digital signal processor (DSP) in combination with a 3D imager would also determine the correct location of an observer with respect to the projection screen. Characteristics about the observer, such as the observer's head position, head tilt, and eye separation distance with respect to the projection screen would also be determined by the DSP and the imager.
"The 3D imager may be any suitable scanner or other known device for locating and determining the positions and characteristics of each observer," the company went on to say. "Such characteristics may include, for example, the heights of the observers, head orientations (rotation and tilt), arm and hand positions, and so forth."
In some embodiments, the 3D imager may be configured as an integral part of the projector, which could be configured to directly illuminate the observer as well as the projection screen. An appropriately located light sensor would then be positioned to pick up the illumination light that is reflected from the observer, determining his or her position relative to the display.
Apple added that the 3D imager and the light sensor could also provide a means for observer input: "For example, the volume in front of the projection screen in which the observer is positioned may be constituted by the 3D display system as a virtual display volume that is echoed as a 3D display on the projection screen. The virtual display volume can then be used for observer input. In one embodiment, the observer can then actuate, for example, a 3D representation of a button to activate certain features on a virtual active desktop. Such an active desktop would be represented virtually in the virtual display volume and, by virtue of the 3D projection on the projection screen, would appear to the observer as a 3D image in the virtual display volume in the immediate presence and proximity of the observer. Other human interface behaviors are similarly possible, as will be understood by persons of ordinary skill in the art in view of the present disclosure."
In concluding its filing, originally submitted back in September of 2006, the Cupertino-based company asserts that such display technology is "straight-forward, cost-effective, uncomplicated, highly versatile and effective, can be surprisingly and unobviously implemented by adapting known technologies, and are thus fully compatible with conventional manufacturing processes and technologies."
The IRT Broadcast Metadata exchange Format (BMF) is now available in version 1.2, which adds new functions and broadens its scope of application. The BMF standard enables the standardized exchange of metadata in the television production domain. Interested parties may register online for a free download of the new BMF class model specification.
BMF 1.0 already covered many different television production use cases with a single standardized data model, and BMF 1.2 extends that scope considerably. BMF 1.2 can now also tag TV items for internet offerings and can therefore be used for both media. Additional data structures for schedule and programme planning have been added, along with extensive options for describing audio track allocations, and the predefined data lists have been extended as well. BMF 1.2 thus supports further television production processes. On request, IRT provides an XML schema associated with BMF 1.2. Because all classes and attributes are currently being registered with SMPTE (Society of Motion Picture and Television Engineers), BMF is accepted as an official exchange format, which now also allows standard-compliant exchange of metadata via the MXF (Material Exchange Format) file format. With immediate effect, all interested broadcasting organisations and companies may freely download the new BMF 1.2 class model specification from IRT’s homepage.
The new BMF data model supports all television production processes: programme planning, production planning, production, exchange, broadcasting and archiving. BMF is designed as a standardized (i.e. non-proprietary) class model for metadata that clearly describes the relationships between these data. With BMF, IRT provides the basis for standardized metadata exchange in television production. Existing data models can easily be mapped to BMF and continue to be used internally, so BMF reduces the effort of adapting and integrating proprietary data interfaces.
As the basis for developing the data model, IRT extensively analysed production processes from programme idea through planning to broadcasting. These considerations covered feature productions built from scenes as well as news programmes built from items. Alongside editorial and production systems, the interfaces to the current archiving systems of the German public broadcasters ARD and ZDF, as well as programme exchange, were also considered.
Monday, March 17, 2008
Labels: IT Broadcast
"The last two days were spent at ShoWest, the western gathering of the theater exhibitors and the equipment providers that service them. Not unexpectedly, the hottest topic was 3D digital cinema.
I had a chance to meet with the major players in digital cinema projection and learned quite a bit about their plans for 3D. Christie is now pushing a new two-projector 3D solution, as is IMAX, which signed a deal to use DLP technology. While 3D is popular with exhibitors because it creates draw and revenue, the low light levels of single-projector 3D are a concern. Exhibitors wanted the two-projector solution, which is why Christie and IMAX reacted.
Xpand, a Slovenian company that offers a wide range of services in Europe, now has reached 100 theaters with a mix of active and passive glasses solutions. At ShoWest, the company announced it had consummated a deal to buy NuVision, a developer of 3D solutions and a manufacturer of active 3D glasses. The deal means manufacturing of glasses can be expanded and capital is available to create compelling 3D entertainment solutions.
Sony’s 3D solution is currently a twin-projector approach, but a single-projector solution is in development. One way to do this is by packaging two light engines into a single chassis behind a single projection lens. Sony may prefer this option, as achieving the fast liquid-crystal switching speed needed for a true single-engine solution may be difficult.
Panasonic is not going after the DCI-compliant part of the market, but will instead focus on the e-cinema segment: the main projector for art houses or theaters that want to show alternative content like concerts or sports events. In addition, these e-cinema projectors can be used to run trailers, pre-show content and advertising. There is probably a pretty good market for these projectors.
I also met with GDC, a company that makes cinema servers, but also offers turnkey cinema solutions mostly in Asia using Christie or Barco projectors. I was surprised to learn that almost all cinemas being upgraded in Asia are opting to go with the 3D install.
The ability of Hollywood to make the full transition to digital and eliminate film will take some time. In yesterday’s Display Daily, Matt Brennesholtz noted that he thinks the US will be almost 100% digital by 2012, and with international markets about two years behind, 2014 could mark the earliest that film could be eliminated. I am not so sure that will be the case after listening to some of the speakers at ShoWest.
For example, several studio executives at one of the lunch panel sessions said the elimination of film is at least 10 years off. They agreed that outside of the US, the transition to digital is going slowly, with these areas perhaps two years behind the US.
But there are other factors slowing the transition now. I also attended an interesting session about the return on investment with a digital projection system vs. a film system. As one attendee commented, "Don’t you have any good news for us?" The gist of the analysis is that the cost of equipment and maintenance is going up and the lifetime of the equipment is going down.
In addition, we actually don’t have any DCI-compliant equipment right now; current equipment meets the specifications of the Interop group, which is working with DCI and SMPTE to finalize the implementation of truly DCI-compliant equipment. This will happen within a year and a half and will require significant equipment upgrades, including to servers and projectors. The virtual print fee will cover the cost of these upgrades for exhibitors who bought through a third party that is part of the virtual print fee program, but exhibitors who buy directly are on their own for the new equipment.
The reality is that we are now past the early adopter phase and a chasm must be crossed before more mainstream adoption takes place, noted industry expert Michael Karagosian, in his seminar. He was actually asked if he would equip a new theater with digital or film. His answer was film if he was concerned about managing profits carefully, but digital if he felt flush and ready to take on a little more risk. There have got to be a fair number of conservative exhibitors out there, so I think the transition may be slower than commonly thought.
I was also able to see a long clip from Fly Me to the Moon and a full-length screening of Voyage to the Center of the Earth. As one insider told me, Voyage was "the best 1950’s 3D live action movie" he had ever seen. By this, he meant that while there were some painful scenes and transitions in the movie, overall it was quite good from a 3D creation point of view. I would agree with that, and in a few years' hindsight, this film will be easy to mark as an early work.
Fly Me to the Moon is an animated film, so the 3D creation is not as challenging, and the storyline follows the first lunar landing by the US back in 1969. It is both an educational and entertaining movie (three flies in space suits hitch a ride to the moon with the astronauts and save the spacecraft along the way). The movie clip was great, but the 3D trailer is a bit jarring.
Complete and comprehensive coverage will be in the next Large Display Report."
By Chris Chinnock, DisplayDaily
"MTBS is excited to be joined by David Naranjo, Director of Product Development for Mitsubishi Electric. He heads up their efforts in 3D HDTV, and talks about 3D DLP, laserTV, and consumer cinema and gaming.
When I think of Mitsubishi, I think of cars. After researching it, I found that your company or brand structure is somewhat different from other corporate entities. Can you elaborate more on how the Mitsubishi name is used and divided?
Mitsubishi represents more than 40 independent companies that carry the name. Mitsubishi Electric was founded in 1921 with operations in over 30 countries. Industries we serve include consumer electronics, energy, heavy equipment, and countless more. Mitsubishi Electric NA began operations in 1973. We have facilities in thirty states, Canada and Mexico, and roughly 4,000 employees.
How long has Mitsubishi Digital Electronics America been in the HDTV manufacturing business?
Mitsubishi Digital Electronics America shipped the first HDTV models in 1998.
Why did Mitsubishi take a sudden interest in offering stereoscopic 3D (S-3D) solutions to the consumer market?
Mitsubishi started offering 3D-Ready DLPs in 2007 and we expect to expand these models in 2008. After some due diligence with various technology partners, we realized that 3D Cinema was making significant strides in content and technology. It seems natural that we should focus on efforts to bring 3D to the home.
You are heading up this Mitsubishi S-3D initiative. How did you become personally interested and involved? Are you passionate about this technology? Why?
I became personally involved when I saw some of the most compelling and immersive 3D content that has been produced. This includes movies, live events (NBA All Star game, NFL games, XGames, etc), and PC games. The focus of Mitsubishi has always been to enhance the big screen immersive experience for consumers. 3D to the home is a natural strategic fit for Mitsubishi. The 3D experience is best viewed on a BIG screen TV. This is what we do best.
For Mitsubishi to go in the S-3D direction was a big decision to make. What was your biggest motivator?
DLPTV has the core technology to provide the best 3D experience in the home. The biggest motivator was to leverage this advantage in order to deliver the best 3D-to-the-home experience.
Tell me about Texas Instruments (TI), and what is their "DLP 3D HDTV" technology all about?
Built around the Digital Micromirror Device (DMD), our 2007 Diamond DLPs and all of our 2008 DLP models can display at a 120Hz frequency, with the glasses synchronized through a separate IR emitter or through the screen. This makes it possible to send 60Hz to each eye of the eyewear, which provides the highest contrast and best experience for consumers.
The technology is based on LCD shutter glasses where the lenses flicker between black and transparent very rapidly so each eye gets an independent image. Up until now, "pageflipping" was the solution of choice because you got a complete image in front of each eye at a time. The TI solution seems to be using a checkerboard pattern of some kind. Can you elaborate on what this is and how it works?
DLP with 120Hz technology accepts two frames, a left frame and a right frame each at 960 x 1080 resolution, and then recombines them into a checkerboard pattern for display on a 1080p DLP. The DLPTV merges the side-by-side formats to create the maximum resolution without sacrificing image quality or the 3D effect.
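The checkerboard packing Naranjo describes can be sketched in a few lines of NumPy. This is an illustrative reconstruction of a quincunx interleave, not TI's actual implementation; the exact pixel layout and phase are assumptions.

```python
import numpy as np

def checkerboard_pack(left: np.ndarray, right: np.ndarray) -> np.ndarray:
    """Interleave two half-width eye views into one full-width frame.

    left/right: (H, W/2) arrays holding the 960x1080-style eye views.
    Returns an (H, W) frame in which pixels alternate between eyes in
    a checkerboard (quincunx) pattern. Illustrative sketch only; the
    phase of the pattern is an assumption, not TI's documented layout.
    """
    h, half_w = left.shape
    out = np.empty((h, half_w * 2), dtype=left.dtype)
    # Even rows: left view on even columns, right view on odd columns.
    out[0::2, 0::2] = left[0::2]
    out[0::2, 1::2] = right[0::2]
    # Odd rows: the assignment flips, producing the checkerboard.
    out[1::2, 1::2] = left[1::2]
    out[1::2, 0::2] = right[1::2]
    return out

# Flat test views: left eye all 0, right eye all 255, 960x1080 each.
L = np.zeros((1080, 960), dtype=np.uint8)
R = np.full((1080, 960), 255, dtype=np.uint8)
frame = checkerboard_pack(L, R)
print(frame.shape)  # (1080, 1920) -- a full 1080p frame
# frame[0] starts 0, 255, 0, 255, ... and frame[1] starts 255, 0, ...
```

The display then demultiplexes this pattern back into alternating left/right fields at 120Hz, which is why each eye ends up with a 960 x 1080 view rather than half the vertical resolution.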
Tell me about the LCD shutter glasses themselves. Why is a high flicker rate important? Is there any risk of headaches and sickness with 3D HDTV? Why or why not?
Most TVs display content at a 60-frames-per-second rate. The active eyewear shutters the left and right sides at 60 frames per second each, providing consumers with a frequency they are already accustomed to viewing. As such, the technology minimizes the risk of eye strain or headaches.
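The timing behind that answer works out simply: if the glasses alternate between eyes and each eye sees 60 frames per second, the system must switch 120 times per second. A minimal sketch of the arithmetic (the 60 fps figure is from the interview; the phase-length derivation is mine):

```python
# Shutter-glasses timing sketch. The 60 fps per-eye rate is from the
# interview; the derived numbers below follow from simple arithmetic.

per_eye_rate_hz = 60
total_switch_rate_hz = per_eye_rate_hz * 2   # glasses alternate L/R
phase_ms = 1000.0 / total_switch_rate_hz     # open time per eye phase

print(total_switch_rate_hz)  # 120 -- hence "120Hz technology"
print(round(phase_ms, 2))    # 8.33 ms per shutter phase
```

Each eye's shutter is therefore open for roughly 8.3 ms at a time, which is why the display itself must refresh at 120Hz to feed both eyes without dropping either below the familiar 60 fps.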
With earlier CRT technologies, they usually required lower resolutions to get higher refresh or flicker rates. Does 3D HDTV face that challenge?
DLPTV does not have any challenges in this regard. LCDs, which Mitsubishi also develops and manufactures, will not face the same challenge either.
One of your competitors has released a 3D HDTV Plasma. Do you see Mitsubishi following this direction too?
Our competitors also see the value of having 3D in the home as a large screen experience. The Plasma shown at CES was a 50" Plasma. Mitsubishi is also focused on expanding 3D in the home with key technologies such as DLP TV, and LCD TVs.
What price and solution range are 3D customers looking at to get a 3D HDTV solution in their home?
Consumers that we have surveyed and shown the 3D content on DLPTV are absolutely immersed in the experience. We expect to announce a solution for consumers later this year that will be very competitive and establish the market.
DLP solutions have faced declining sales in relation to LCD and plasma HDTV solutions. Why is this technology still attractive to Mitsubishi?
DLP is the only technology that offers the big screen experience for an affordable price. In addition, DLP is the only technology that offers the most compelling and immersive experience for 3D.
Tell us about LaserTV. What is it and how does it work?
Mitsubishi had the world premiere of LaserTV at CES 2008. More details will be announced in the coming months.
I have read that it uses much less power and promises far superior color accuracy compared to current solutions on the market. Can you give us some numbers to illustrate how much better this technology is?
Most HDTVs are capable of only displaying 40% of the colors that the human eye can see. LaserTV produces twice the color.
When I think of DLP, I think of big heavy HDTV solutions. How would you describe a LaserTV solution?
The concept of DLP being heavy no longer applies. The 73" DLP that we introduced last year weighs only 96 lbs. By comparison, other technologies at this screen size can weigh several hundred pounds and cost three times as much. More details on LaserTV will be announced in the coming months.
To date, how many S-3D HDTV units have you sold? Any projections for 2008?
The 3D efforts have been primarily focused on the US market by Mitsubishi and a few other manufacturers. For 2007 - 2008, it is expected that over two million consumer display units will be in the marketplace.
Of the units that have been sold, what percentage of customers are aware that they own an S-3D solution? Do you think this was an influential component of their buying decision? Why or why not?
We have promoted the 3D-Ready capability of our TVs at retail as well as on our Mobile Showroom that is traveling across the US. All consumers that we have surveyed are anxious to have 3D content for the home.
Are you trying to push the benefits of S-3D right now, or are you waiting until you have more S-3D content in the market before going all out on S-3D promotion?
We are pushing the benefits right now. We are working with several key partners on providing content and a consumer solution for 2008. NDA prevents further discussion.
What’s out there right now for 3D HDTV? Games, movies?
There are several ways for consumers to get 3D on the 3D-Ready TV today. There are some companies that are providing a PC based solution that enables 2D PC games to be converted to 3D for display on the DLPTVs. In addition to PC games, 3D content has been created for several cinema movies. Amazon.com lists several 3D movies that consumers can purchase. We certainly anticipate more 3D movies to be available as well as broadcast and cable in the coming years.
I launched MTBS to grow the awareness, adoption, and proper implementation of S-3D in the consumer space. Tell us about some of the initiatives that Mitsubishi has been taking to achieve similar goals.
In addition to demonstrating 3D content on our DLPTVs at trade shows and events (CEDIA, CES, PGA events), we also have a 53’ Mobile Showroom that is traveling across the country as part of our retail marketing efforts. This trailer has a 3D demo on our 73" DLP, in addition to more than a dozen 1080p HDTVs, including Mitsubishi’s award-winning SuperSlim LCD flat panel and large screen HDTVs with DLP® technology that deliver maximum picture in minimum space. Incorporating Mitsubishi’s innovative UltraThin Frame design, the sets’ frame width is less than one inch wide, so a 46-inch LCD flat panel is only 42 inches wide and sleekly fits into a space equal to or smaller than the typical 42-inch plasma set. The TVs are also thinner and lighter: a 73-inch set weighs less than 100 pounds and has a depth of less than 18 inches, taking up less space than a 46-inch plasma TV on a base.
What has the response been to your consumer S-3D road show so far?
Consumers are captivated and immersed by the 3D experience. We have had consumers waiting up to 2 hours to see the demo. We are very pleased with the reaction to the 3D demo as well as to our 2007 Award-Winning line of 1080p DLPs and LCDTVs (e.g. CEDIA Excellence Awards and CES Product of the Year).
To make S-3D a success in the consumer industry, I’m a big believer in building leverage to create change. For the S-3D industry to be successful in the consumer markets, where do you think this leverage will come from?
Content is King. This has been true of any new display technology or device. When manufacturers launched HDTV products in the late 1990s, consumers purchased for the capability to watch widescreen movies from their DVDs. As more HDTV content became available from movies, broadcast, and cable, consumers gravitated toward the technology. We are starting to see the same for Blu-ray players. As more compelling and exciting content becomes available in that format, consumers have an increased awareness and are purchasing products and content. I see the same hockey-stick effect happening for 3D. As we develop a consumer-friendly 3D solution in addition to the display, content will follow.
Our members love to play games in S-3D. What gaming content is and will be available for 3D HDTV? Who do you see as the core companies making this happen right now?
PCs with the proper graphics card and software can be used to play several popular 2D PC games that are rendered as 3D. I expect to see more game support in the coming months. As this progresses, we clearly see console game hardware manufacturers also developing the capability for console games. Our Diamond 2007 DLPs and 2008 DLPs are able to leverage all of these technologies.
A lot of our members connect their 3D HDTVs to their PCs, so I think it is a complete myth that 3D HDTV is only good for consoles. However, console support for S-3D would be a good thing. Any educated guesses or information to share on when we can expect to see gaming console support for 3D HDTV solutions? By whom?
Console manufacturers are very interested in 3D. The graphics engines and CG already embedded in consoles and games can be leveraged to render console games in 3D. It does take effort to make these conversions, and we have already seen efforts by several gaming content companies to make them. At CES, TI demonstrated 3D gaming from an XBOX 360, and at CEATEC in Japan, game publishers were demonstrating 3D console games on other game platforms.
We have had the privilege of interviewing the likes of Joshua Greer, President of Real D, and Tim Partridge, EVP for Dolby Laboratories (Dolby 3D). Clearly the S-3D cinema space has taken off with 2:1 and 3:1 revenue compared to traditional 2D cinema. What are the challenges in getting these S-3D successes to the consumer market, and what steps is Mitsubishi taking or will be taking to accomplish this?
The 3D movie content has to be encoded into a Blu-Ray format from the studios for a “3D to the home” solution. In addition, the consumer purchasable solution needs to be simple for consumers to understand and setup. We are very focused on getting this consumer value proposition to the market. We are working with many of the studios and leaders in the 3D Cinema space.
While your product is innovative, S-3D has been around for some time, and it has clearly had its pitfalls. What’s different today? In addition to your product, how do you envision the S-3D industry’s future? Is it going to have explosive mass appeal, or a niche market success? Why?
3D has come a long way from an eyewear point of view as well as the actual content. Consumers who have seen 3D from Mitsubishi are stunned by the level of immersive experience. No one even mentions the old days of 3D, when anaglyph glasses were used. The right consumer value proposition and the strong partnerships from many stakeholders will dictate the success or failure of 3D now and into the future.
Aside from selling millions of units, if an industry genie could give you anything you wanted, what three wishes would you ask for?
Content, content, and more content!
If readers could walk away with one message from this interview, what would it be?
The 3D immersive experience is very different today than in the past. The display technology is here with DLPTVs, and we expect to expand to other display technologies. Mitsubishi has also announced the launch of LaserTV later this year. If you have seen 3D on a DLP and think it is incredible, wait until you see it on LaserTV! Not only is 2D a True Dimension Experience with LaserTV, 3D adds the immersive factor with twice the color of any other HDTV display technology. Exciting times are ahead and Mitsubishi is leading the way! Also, MTBS has proven to be the most passionate and comprehensive website to educate and build awareness for S-3D.
Source: Meant To Be Seen
David Wooster and Duncan Humphreys from The3DFirm provide us some technical information regarding the recent 3-D live rugby match.
What was the HD format used in the capture, encoding, compression and presentation etc?
What was the bit rate of the uplink for each of the streams?
19 Mbps on each stream. We played it safe for this test so as not to take up too much bandwidth, but we would anticipate going to 40 Mbps per stream in the future.
How were the streams time-locked or bonded together?
Each camera was gen-locked. These feeds were then sent to the vision mixer where they were paired together so the mixer saw them as one (when you cut to camera one you actually cut the 2 cameras that were on position one) and then a fairly standard OB edit took place between camera positions. The output from the mixer or broadcast feed was then compressed as 2 SCPC ASI streams, multiplexed together and transmitted via satellite. This satellite signal was then received in London, decoded and fed to 2 Christie 8K HD projectors.
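The pairing the crew describes — the mixer treating each genlocked left/right pair as a single source, so a cut switches both eyes at once — can be sketched in a few lines. This is a hypothetical illustration, not The3DFirm's actual control software; the class and method names are invented for clarity.

```python
# Minimal sketch (not The3DFirm's real system) of paired genlocked feeds:
# each camera position is a left/right pair, and a cut on the vision
# mixer switches both eyes together so the streams stay in lockstep.

class CameraPair:
    """One stereo camera position: two genlocked feeds."""
    def __init__(self, position, left_feed, right_feed):
        self.position = position
        self.left = left_feed
        self.right = right_feed

class StereoMixer:
    """Vision mixer that sees each pair as a single source."""
    def __init__(self, pairs):
        self.pairs = {p.position: p for p in pairs}
        self.live = None

    def cut(self, position):
        # Cutting to "camera one" actually cuts both cameras on
        # position one at once, as described above.
        self.live = self.pairs[position]
        return (self.live.left, self.live.right)

mixer = StereoMixer([
    CameraPair(1, "cam1-L", "cam1-R"),
    CameraPair(2, "cam2-L", "cam2-R"),
])
print(mixer.cut(2))  # both eyes of position two go live together
```

Downstream of the mixer, the two eyes remain separate streams (the 2 SCPC ASI streams mentioned above), which is why they must be multiplexed and kept synchronized through the satellite hop.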
Halfway through the game there was a video “glitch.” What caused that?
Satellite issue due to weather. Also happened on 2D TV signal.
Was it circular polarization on the glasses or linear?
How many seconds behind live was it?
Approx 6-7 Seconds.
How did you re-align the radio audio to sync with the video?
We took the Radio Scotland commentary straight from their commentary booth, shifted the sync slightly, and mixed it on site with the international sound.
What was the sound format?
4.0 (Quadraphonic) with phantom centre speakers.
What was the make/model of the camera?
Pairs of broadcast Sony 950 cameras with wide angle HD Zooms on 3DFirm Calcutta 3D rigs.
Monday, March 17, 2008
"Eager to get American cinema complexes ready for a surge in 3-D movies next year, four major Hollywood studios announced on Tuesday a deal to subsidize the conversion of 10,000 theaters to digital projection systems.
The announcement, at ShoWest, the annual trade show that gathers theater owners and movie distributors here, overlooked one point: the theaters that could be converted under the deal have yet to agree to it.
The motion picture industry is racing to roll out digital projectors, not just because they avoid the costly printing and shipping of reels of film, but also because they’re needed to show the current generation of 3-D films, which have often been bonanzas at the box office. One, “Hannah Montana & Miley Cyrus: Best of Both Worlds Concert,” generated $31 million its opening weekend on only 683 screens, about one-fifth as many as the typical wide release.
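A quick back-of-envelope check, using only the figures quoted in the article, shows why that opening was considered a bonanza on so few screens:

```python
# Per-screen average for the "Hannah Montana" opening weekend,
# using the gross and screen count quoted in the article above.
gross = 31_000_000   # opening-weekend gross, in dollars
screens = 683        # screens showing the 3-D release

per_screen = gross / screens
print(f"${per_screen:,.0f} per screen")  # roughly $45,000 per screen
```

A typical wide release, playing on roughly five times as many screens, averages far less per screen, which is the comparison driving exhibitor interest.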
Under the deal announced on Tuesday, the Walt Disney Company, 20th Century Fox, Paramount and Universal all agreed to pay “virtual print fees” for each movie they distribute digitally to the participating theaters. Theater owners will use the fees to buy the projectors, servers and other equipment needed — about $75,000 for each auditorium.
Also on Tuesday, Paramount executives confirmed that “Indiana Jones and the Kingdom of the Crystal Skull” would be released digitally, though its director, Steven Spielberg, has long insisted that his movies be released exclusively on film. Every movie that earned more than $100 million last year was released both digitally and on film.
Access Integrated Technologies concluded a first round of 3,740 theater conversions last year. It now must go out and sell its systems to other cinema owners. It has three years to accomplish those installations; the studios will pay the virtual print fees for up to 10 years.
The size of these virtual print fees was not disclosed, but one person involved said it would be around $800 per movie, per theater — down from about $1,000 in the first phase.
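Taken together with the ~$75,000 equipment cost per auditorium quoted above, the reported fee implies a rough break-even point. This is a simplified sketch that ignores financing costs, maintenance, and how fees from multiple studios overlap:

```python
# Rough break-even on the virtual-print-fee model described above.
# Simplified: ignores financing, maintenance, and multi-studio overlap.
equipment_cost = 75_000   # per auditorium (projector, server, etc.)
fee_per_movie = 800       # reported VPF per movie, per theater

movies_to_break_even = equipment_cost / fee_per_movie
print(f"about {movies_to_break_even:.0f} digital releases")  # ~94 titles
```

At that rate, the 10-year window the studios committed to paying fees for would comfortably cover the equipment, assuming a steady flow of digitally distributed titles.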
Chuck Viane, president of distribution at Disney, said the studios were insistent that theater owners cover more of the cost of converting, including maintenance. “We’ve always felt that exhibition had to have some skin in the game,” he said.
The announcement came as, in a separate deal, the nation’s three largest theater chains — Regal, Cinemark and AMC — were negotiating for what Variety reported would be a $1.1 billion line of credit to finance the conversion of their theaters to digital cinema. The three, bargaining as Digital Cinema Implementation Partners, own about 14,000 of the nation’s 37,000 screens."
By David M. Halbfinger, The New York Times
"It's coming up on three years since the Digital Cinema Initiatives consortium announced its specifications for d-cinema systems, and now d-cinema has begun to take off in earnest.
Yet for all the benefits that d-cinema was supposed to offer studios and exhibitors, interest in it has been fueled in large measure by a feature that was barely an afterthought for DCI: stereoscopic 3-D.
The DCI helped make 3-D more popular, and 3-D in turn has boosted DCI-compliant systems. Yet when it comes to 3-D, the DCI has arguably been a victim of its own success, in that 3-D's growth took almost everyone by surprise.
The DCI was set up by studios to "establish and document voluntary specifications for an open architecture for digital cinema that ensures a uniform and high level of technical performance, reliability and quality control." In other words: to make d-cinema as standardized as 35mm prints and projectors. DCI wrote specifications, and then the Society of Motion Picture & Television Engineers (SMPTE) wrote standards for manufacturers.
Digital cinema consultant David Reisner explains, "The goal of the studios is to have single inventory or, as close as possible, one mastering." That seems to be working, as far as it goes. There is a single, consistent standard for a digital cinema "package" (the equivalent of a release print) for regular 2-D films.
But the rush to put 3-D in theaters got ahead of the specifications, so there are now competing proprietary 3-D systems, notably from Dolby and Real D. The competition may not be as dramatic as the homevideo format wars, but it is the kind of problem DCI was set up to avoid.
"There is one format for the digital files, but the digital files that look good for one (3-D) system aren't necessarily the files that look good for another system," Reisner says. "We have multiple 3-D systems out there, and we don't have a mechanized system to go from one to the other. Right now they have to be mastered individually."
That can be a headache for studios and distributors. Warner Bros. technology VP Wendy Aylsworth, who is also technical operations and engineering VP of SMPTE, says that for the worldwide release of "Beowulf," "Between 3-D formatting issues, servers that hadn't been upgraded to the latest configuration and subtitling issues, we probably released a dozen different versions of the movie for 3-D, as opposed to one version for the standard 2-D version."
Ahead of schedule
DCI had not anticipated that d-cinema would turn 3-D seemingly overnight from a problematic novelty into a bona fide attraction for audiences.
"We knew it was a possibility," Aylsworth says. "We just didn't know the studios and consumers would jump on this bandwagon. We thought it was 10 years out."
So DCI has been hustling to catch up, issuing an addendum last year to address 3-D and move toward a single master for 3-D. Then there's the issue of subtitles, which complicates matters.
"Do you burn the subtitles into the master or keep them as a separate file and merge them into the image? If the subtitles are jumping back and forth, that's going to make people seasick," says Reisner.
Still, the omission of 3-D specifications from the DCI may have some silver lining.
Rob Engle, senior stereographer and digital effects supervisor for Sony Pictures Imageworks, notes the technology for projecting 3-D on the bigscreen has improved markedly over the last year and is getting better all the time.
The 3-D "Monster House," Engle says, was released at less than standard 2K resolution. "There was no equipment out there that could have played it in its highest quality possible, so what's the point? Why would you release a master that nobody could play?"
Yet when "Beowulf" was released a year later, it was possible to screen the film in 3-D at full resolution.
"If they had done a standard a year ago, it wouldn't have been this level of quality, because there was no way the equipment could have done it. Now it can. So the spec you get now may be very different from the spec you would have gotten a year ago," Engle says.
Screen size dilemmas
Those improvements are raising new issues, though. Ray Feeney, president of RFX and one of the most respected technologists in the industry, warns that new projectors, bright enough to throw a 3-D film on screens as big as 60 feet wide, reveal some basic problems.
Until now, digital projectors limited commercial 3-D exhibition to screens roughly 25 to 30 feet wide. "When you go up in size, the images tend to diverge and make it difficult for your eye to resolve them as stereo," says Feeney. "When you go down in size the stereo effect lessens."
This is a problem for multiplex owners who want to run 3-D films through their normal cycle, starting them in big theaters and moving them to smaller ones over the course of their release window. Feeney believes this may require different masters for different-size screens, despite the best efforts of the DCI to avoid that.
"When (the DCI) set out to do digital cinema, they went at it in a thoughtful, scientifically researched approach to the problem," Feeney says, but with 3-D, the focus of research has been on how to create a single 3-D release package, not on the larger issues of 3-D.
"The studios need to take a look at it before they standardize on a package," he says. "I just think people are going to have to look at the higher-level issues behind this as part of a studio effort, not just say put the left eye this way, the right eye that way, and pack the bits this way."
This series of four podcasts is devoted to Digital 3D; for cinema, for use in business and for viewing in the home. Your host Bill Scanlon is founder of digital 3D production company Far Blue Images. In this first programme, Bill’s guests are Dave Monk, CEO of the European Digital Cinema Forum, and David Hancock, Head of Cinema at research company Screen Digest.
To start the series, they set the scene for the digital 3D business at the beginning of 2008; the finance, the technology and the films. How many 3D screens are in operation? Where are they and how are they being financed? What are the catalysts for growth? Who’s making digital 3D content? How are the public receiving it?
In coming programmes, we’ll discuss production, post production and the cinema presentation business, with industry experts and people whose future is tied to this exciting new format.
Download this podcast
Source: The Broader Issue
Monday, March 17, 2008
EVS, a world leader in broadcast production servers, announced integration with Final Cut Studio 2 and native support for Apple ProRes 422 to meet the broadcast industry’s requirements for High Definition cross platform workflow solutions. The XT production server from EVS is known for its extreme reliability, versatility, and incomparable speed. The XT serves as the foundation for broadcasters and production facilities looking to ingest, organize, playout, and exchange the highest quality video with post production systems for live and near-live broadcast production.
“By integrating the new Apple ProRes 422 codec, EVS takes another step forward to enlarge broadcasters' production and post production perspectives,” said Pierre L’Hoest, EVS Chief Executive Officer. “We are proud to give Final Cut Pro users what they've been asking for: the ability to access media recorded on our XT production servers instantly without requiring any transcoding. Now any HD XT server can be easily updated to integrate this new standard, allowing our customers to benefit immediately from this new capability."
“Final Cut Studio is becoming the video production suite of choice for broadcast and post-production professionals around the world,” said Rob Schoeben, Apple’s vice president of Applications Product Marketing. “Native support for Apple ProRes 422 in the XT means that the incredibly popular tool set of Final Cut Studio is now available to editors during sports and live event production."
The XT, combined with its related software applications such as IP Director and MulticamLSM, offers the flexibility to handle both SD and HD formats and boasts an “always-on” permanent loop recording capability. Every second of every event can be logged, and operators can attach searchable keywords and other metadata to the assets for comprehensive file management. The open architecture of the XT and its native support of the Apple ProRes 422 make it the most reliable and efficient system for the transfer and streaming of ingested media with associated metadata to Final Cut Pro craft editors in both SD and HD.
Thursday, March 13, 2008
Labels: IT Broadcast
"On Wednesday at ShoWest, Walden Media and New Line presented an advance screening of "Journey to the Center of the Earth 3D," which looks poised to be the first live-action narrative feature to be lensed and released in digital stereoscopic 3-D. It's slated for a July 11 release, though that could change as Warner Bros. takes over at New Line.
A fair number of animated films already have entered or gone through a 3-D stereoscopic pipeline. But the challenges of 3-D production in live-action filmmaking are still quite new and have been a frequent topic of discussion in the community.
"Journey" is the directorial debut of VFX veteran Eric Brevig, who shared a special achievement Academy Award for the visual effects on "Total Recall" and was Oscar-nominated for "Pearl Harbor" and "Hook."
"(3-D) is no more challenging than any other new technology," he says. "Whether it's visual effects, motion capture -- all of those things require the use of specialized equipment."
Brevig says "Journey" was planned as a 3-D feature from the start, and the action was designed to take advantage of the format. There were two clear messages: Filmmakers need to do their homework before starting production, and viewing work as the audience will see it is critical.
The adventure was shot on location and on stages in Montreal and Iceland. Brevig turned to Burbank-based Pace to use its stereoscopic HD digital studio camera system, developed by Vince Pace -- a veteran underwater and special effects cinematographer who founded the company -- and partner James Cameron. Brevig says that after a lot of early testing, "we knew the smartest ways to work with the cameras on set."
He says it is important for filmmakers to view dailies in the theatrical environment. For "Journey," a 30-foot screen and two projectors for right eye/left eye were installed on set.
"We had to pioneer a path for our image data so we could view it in 3-D when needed," Brevig says, moving to the subject of postproduction. "We cut in 2-D, then conformed in 3-D and (checked the shots), then we made adjustments. ... What we did, which we think is going to prove desirable, is color grading and adjustments of depth in real time in a theater environment."
He adds: "Part of the editing is just to make sure (3-D) is a comfortable experience for the viewer. You can have a wonderful off-the-screen 3-D moment, but you also want to allow the viewer to enjoy the movie without having to do eye calisthenics. That's something you can't judge on a small screen."
Visual effects, meanwhile, posed some unique challenges.
One was in the area of compositing, which is the process of combining separate visual elements (i.e., live action with animation) into a single image.
"You have to do two composites, a right eye and a left eye," Brevig says. "And, when viewed together in stereo, the layers of the composites (need to be) in the proper Z space (a term used to describe distance from the viewer). ... Cheats that you can do in 2-D you can't in 3-D."
Therefore, he says, everything has to be rendered twice, which takes twice as long.
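Brevig's point — every composite done twice, once per eye, with each layer sitting consistently in Z space — can be illustrated with a toy model. This is a hypothetical sketch, not a production pipeline; the `parallax` formula and its parameters are invented for illustration.

```python
# Toy illustration (not a production pipeline) of stereo compositing:
# every composite is built twice, once per eye, and each layer's
# horizontal offset (parallax) is derived from its distance in Z space.

def parallax(z_depth, interaxial=6.0, scale=100.0):
    """Horizontal shift for a layer at a given depth (toy formula).
    Closer layers (small z) get a larger left/right offset."""
    return interaxial * scale / z_depth

def composite(layers, eye):
    """Stack layers back-to-front, shifting each by its parallax.
    The same layer must land at the same Z-consistent offset in
    both eyes -- the "cheat" you can't do in 3-D."""
    sign = -1 if eye == "left" else 1
    frame = []
    for name, z in sorted(layers, key=lambda l: -l[1]):  # farthest first
        frame.append((name, sign * parallax(z)))
    return frame

layers = [("actor", 300.0), ("cave_wall", 1200.0)]
left = composite(layers, "left")    # rendered once for the left eye...
right = composite(layers, "right")  # ...and again for the right eye
```

The two calls to `composite` are the doubling Brevig describes: the same layers, the same Z placement, but two full passes, one per eye.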
For visual effects, the work was handled at facilities including Weta Workshop ("The Lord of the Rings" trilogy), Rhythm and Hues ("The Golden Compass") and Sony Pictures Imageworks (the "Spider-Man" films)."
By Carolyn Giardina, The Hollywood Reporter
"GDC Technology of Singapore has been a driving force behind Asian digital cinema for the past seven years. We recently spoke with GDC’s founder, Dr. Man-Nang Chong, about China’s ongoing 2,000-screen deployment and the Asian digital-cinema business in general.
FJI: With over 1,000 installations worldwide and now with the recent deployments in China, GDC is positioned as the world’s second-largest integrator of digital systems. Can you give us an overview of GDC’s business strategy?
Dr. Chong: First, we continue to allocate our resources on research, technology and manufacturing. I like to believe GDC prevails by taking feedback from our customers and building cost-effective solutions that meet their needs. By listening to the exhibitors and supporting them, GDC continues to grow its market share through trust and respect. In China, we have grown our market share from almost zero to more than 95% today. Our market share in Singapore, Thailand, Taiwan, Hong Kong and Korea is more than 60% combined. Our customers know that they have been able to count on GDC for the past seven years, and they can continue to rely on us for many years to come.
FJI: China has committed to installing 2,000 DCI-capable digital-cinema systems in the next two years. Can you give us an overview of GDC’s role in the deployment and how the partnership came about?
Dr. Chong: In 2006, the Institute of Digital Media Technology Limited [IDMT] entered into a cooperation agreement with China Film Group Corporation [CFGC] for a term of ten years, whereby IDMT and CFGC agreed to jointly promote digital-cinema business in China. GDC is engaged to supply, install, maintain and network-manage the digital-cinema systems at the top 100 cinemas in China.
FJI: What is the status of the deployment today and is it going as expected?
Dr. Chong: We have retrofitted close to 500 cinema theatres with our DCI-2000 Integrated Projection System so far. Recently, Sony Pictures’ The Pursuit of Happyness was released in China in digital only, with more than 500 digital copies. That said, the number of installations falls short of our year-end goal of 700 installations due to the unavailability of projectors and delivery delays caused by China’s severe winter weather.
FJI: Mr. Cheng Yang, CFGC manager of digital cinemas, gave us an impressive presentation at CineAsia explaining how they have arranged their theatres into several tiers of quality. Can you give us an overview of how China has segmented the digital-cinema market?
Dr. Chong: We believe that almost all commercial cinemas in China cities will eventually install DCI-compliant systems, although some second-run cinemas may choose to install non-compliant systems for showing local content. For non-commercial cinemas in smaller towns and rural areas, China’s SARFT [State Administration of Radio, Film and TV] is promoting low-cost digital systems as an alternative to their existing 16mm projectors.
FJI: I understand that GDC is also managing the distribution of feature content, including pre-show and advertising. Can you give us an overview of the network GDC is building to support distribution to the cinemas?
Dr. Chong: GDC provides the network operations center [NOC] service to customers such as CFGC, Dadi and Golden Harvest cinema chains via modem and ADSL. With the NOC, we can deliver the KDM keys, pre-show content, onscreen advertisements and in-foyer advertisements. Today, some of these cinema multiplexes’ foyers feature LED panels, all connected to a GDC SDM4000 Display Maestro that is capable of delivering multiple HD streams. All the displays in the foyer are connected to our Theater Management System (TMS), which is linked to the cinema’s ticketing system for scheduling. Using our TMS, cinema operators can now display the full array of trailers, advertisements and other pre-show content on both the in-foyer displays and in-theatre screens from a central point of control.
FJI: Security is a large part of the DCI specification. How is security being managed and how are the keys distributed?
Dr. Chong: GDC’s EN2000 encoder has been deployed in China and Hong Kong, where the DCI encoding is performed in faster than real-time. The encoder also provides DCI-compliant KDM key management for the distribution of the content. Depending on the distributors, the KDM keys can be delivered to the cinemas via e-mail, NOC or physical delivery.
FJI: How is the supply of digital titles in China? How much from Hollywood and what is available from Chinese distributors?
Dr. Chong: Since the first installation in May 2002, the digital screens in China have never gone dark. There has always been a strong supply of content competing for the digital screens, whether it is from Hollywood, China, Hong Kong or other markets. There are reasons for such demand: 1) the digital-cinema theatres are the top-grossing cinemas in China, 2) the huge savings in distribution cost, and 3) the limited number of digital-cinema theatres in China. We have observed quite a few instances of distributors competing for the same digital screens in China.
FJI: At the theatres, how have the Chinese exhibitors accepted the transition to digital? Have there been any unexpected obstacles in getting the theatres to accept the technology?
Dr. Chong: Most of the deployment work is being done by CFGC. GDC plays the role of supplying, installing and maintaining the network. Judging from the high number of installations within a few months, I believe the Chinese exhibitors must be comfortable with the digital systems. In fact, one province has almost been completely retrofitted. Since the availability of the installed digital systems is approximately 99.8%, their ease of use certainly prevails over the legacy 35mm projectors.
FJI: What is the status of 3D in the Chinese and Asian markets?
Dr. Chong: 3D digital cinema only recently began to receive attention in Asian markets with the exception of Korean exhibitors, where GDC shares more than 60% of the Korean 3D market. In Hong Kong, three out of four 3D digital-cinema installations use GDC servers. We also have a strong presence in Taiwan and Singapore for 3D digital cinema.
In China, there is no digital 3D yet, but certainly the wave will catch on when the 3D content is available. China is evaluating the 3D technology and the business terms with the current providers. Most exhibitors in these regions would like to avoid locking themselves into a proprietary technology that depends upon a single supplier of glasses and services. Asian moviegoers seem to prefer large-screen 3D presentations to the smaller screens required when using existing 3D technologies. The disadvantage of the existing 3D solutions is their low light levels. Whether the 3D is based on polarization or legacy color-separation technology, a single projector can only offer so much light.
We believe a non-proprietary 3D process that drives two digital-cinema projectors is an attractive alternative considering the declining price of projectors. Besides, you will get considerably more light output and more color information by having the server driving two projectors compared with the single-projector 3D presentations.
We invited delegates to view GDC’s True 3D Digital Cinema server at work and they were amazed with the brightness and vivid color presentation. Our server is currently the only one in the market that provides two streams of synchronized 12-bit 4:4:4 images and is agnostic to 3D technologies. It also supports dual-projector as well as various single-projector 3D systems.
FJI: Aside from feature titles, has alternative content been used?
Dr. Chong: In Singapore, the digital-cinema theatres have successfully played opera in the theatres for more than a month and I was told the response was quite good. We are not aware of any alternative content plan for China digital-cinema theatres. I like to think China must build enough digital-cinema theatres to avoid competing for the digital screens before planning for alternative content.
FJI: What has been the audience reaction to digital cinema in China?
Dr. Chong: The majority of moviegoers do not differentiate the traditional 35mm from digital cinema and there is no specific effort to promote digital cinema in China. I believe it is best to leave the moviegoers to enjoy the movies.
FJI: You have been a leader in the Asian digital cinema business since 2000. As a final question, I would like to hear your thoughts on where GDC and the industry are going and what may be in store for the next few years.
Dr. Chong: We constantly face the dilemma between innovating and following the rules. Although the DCI specifications have leveled the playing field for server manufacturers, GDC avoids making a commodity server that is no different from others.
The debate on 4K versus 2K is an interesting issue. The 2K projectors have been delivering bright and vibrant images that meet today’s needs. Besides, the wide installed base of 2K systems over a long period of time has proven its technology and reliability. Perhaps super-bright 4K projectors could be used in very large-screen cinemas for special-venue applications.
I still hate to answer the question “when the transition from analog to digital will be complete,” but at least now no one asks the question “when the transition will begin” anymore. I believe the impasse between the distributors and the exhibitors is really about how to share the cost in the transition, and this is slowly being resolved. With regard to China, the transition from 35mm to digital is continuing and it should take approximately three years for all the top-tier cinema multiplexes to convert to digital."
By Bill Mead, Film Journal International