For several years now, the Indian media & entertainment industry has been talking about the emergence of digital cinema in India. But Ranjit Thakur, CEO of Scrabble Entertainment, says his company is the sole digital cinema provider in India. He tells FE’s Pritha Mitra Dasgupta that all other companies have been using electronic cinema and passing it off as digital.
What is the core business of Scrabble Entertainment?
Scrabble was formed about a year ago. We are the only 2K DCI-compliant digital cinema deployment entity in India. The current source format used in cinema is a D5 or DI frame, which we convert, encrypt and encode to the 2K platform, the higher-resolution digital format. We have plans to venture into advertising on digital cinema, but not this year.
What kind of research did you do before launching this technology in India?
I have spent much time in the US, and we observed that all of India was using electronic cinema, provided by UFO, Real Image and others, that was being passed off as digital cinema. Real digital cinema is governed by a body called Digital Cinema Initiatives (DCI), which sets the specifications for digital cinema worldwide. All the Hollywood studios have come on board with that standard and, therefore, only systems that are DCI-compliant get Hollywood content. So whatever existed before we entered was the electronic cinema platform. No Hollywood movies from the six big studios were being played on the UFO or Real Image platforms. We provide hardware to the cinema for digital playback at a subsidised cost, and also convert, encrypt and encode the content from a D5 or DI frame and deliver it to the cinema.
What are some of the core advantages of the 2K DCI system?
DCI sets rules for two things: security and quality. Security is taken care of in the way our servers are made and the way the content comes. Every key is server-specific. Every server (one per screen) uses a two-part password, one part of which is embedded in the server itself. So even if I give you the content and the delivered password, it is still no good, because you would also need the password embedded in the system. For example, if Dark Knight is to be played in Inox Nariman Point on screen two, it will not play on screen one or three. These are some of the things the studios agreed to when DCI was formed, because Hollywood studios are investing huge sums and security is a key concern for them.
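The screen-binding scheme described above can be sketched in a few lines. This is a hedged illustration only, not Scrabble's or DCI's actual implementation: the key names, the use of SHA-256, and the simple concatenation are assumptions made for clarity. The point it demonstrates is that neither key half is useful alone, and a key issued for one server derives nothing playable on another.

```python
import hashlib

def derive_unlock_key(delivered_part: bytes, embedded_part: bytes) -> bytes:
    # Both halves are required; either one alone is useless.
    return hashlib.sha256(delivered_part + embedded_part).digest()

# The distributor issues a key part bound to one specific server (screen two).
screen2_embedded = b"secret-burned-into-screen-2-server"  # hypothetical value
delivered = b"key-part-couriered-with-the-film"           # hypothetical value
expected = derive_unlock_key(delivered, screen2_embedded)

# A different server (screen one) holds a different embedded secret, so the
# same delivered key part derives a different -- and useless -- unlock key.
screen1_embedded = b"secret-burned-into-screen-1-server"
assert derive_unlock_key(delivered, screen1_embedded) != expected
```

In the real system the binding is done with per-server certificates and encrypted Key Delivery Messages rather than a bare hash, but the one-key-per-screen property is the same.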
Besides, it saves 100% of the print cost. Simultaneous, nationwide release of a film is also possible. It can maximise advertisement revenue and open doors for displaying alternate content. Because of global standards of projection quality, simultaneous release of Hollywood films in India is possible. And finally, it creates a platform for 3D releases.
Will you be concentrating on multiplexes or will you eventually get into cineplexes as well?
Multiplexes have been mushrooming for the last 4-5 years. The number of Hindi films released every year is not enough to fill show times. Sadly, of the 700 films released in Hollywood every year, only 50 are released in India. And trust me, the rest are not so bad. The reason they do not come to India is that the print cost is extremely high. So, once they have the digital platform in India, it will be possible for them to have day-and-date releases here, and we will see more Hollywood films. Today, multiplexes contribute 80% of the all-India box-office collection and, therefore, the potential is huge. So I have not concentrated on the rest of the market.
How much does it cost to convert a screen into DCI-compliant digital cinema?
We have contracts both with exhibitors and with content providers, like distributors and producers. For exhibitors, the contract states that we pay 25% of the equipment cost and, in lieu of that, they allow us a 10-year window to provide them content for all their screens. From producers we charge Rs 20,000 (US$430), which is the virtual print fee per multiplex, and that is our revenue model. So we are like the courier of a movie. When we get a film, we prepare the hardware, inject a password, run DCI-compliance tests and then deliver it.
What is your biggest challenge?
The challenge is at the exhibitors’ end. They do not know what to do with the analog systems, once they convert a theatre into digital. Therefore, while they convert some of their screens into digital, they also need to keep increasing the number of multiplexes where they can use the analog systems or else these simply become scrap.
Source: The Financial Express
Avid announced that it has signed an R3D Software Developer's Kit (SDK) license agreement, part of the company's continued commitment to enhancing its RED camera workflows. More information on Avid/RED workflows is available, including a step-by-step guide and other technical resources.
Source: Digital Production Buzz
A consortium of the nation's top movie-theater chains will announce within two weeks a $1 billion-plus financing agreement with four major studios to equip more than 15,000 screens nationwide with digital-projection systems during the next three years.
Universal, Paramount, Disney and Fox -- with financial backing from JPMorgan Securities -- are expected to announce the funding of systems to be rolled out by New York-based Digital Cinema Implementation Partners. Regal Cinemas, AMC Entertainment and Cinemark formed DCIP more than a year ago, but it's taken until recently to get a majority of the major studios to sign off on so-called virtual print fee agreements to fund the digital rollout.
The deal means studios will pay a majority of the roughly $100,000 per system in hardware and installation costs to install d-cinema equipment in theaters operated by the biggest theater chains. That will facilitate not only digital projection in the converted auditoriums but potentially also 3-D exhibition, if the theater owners decide to take on the extra, more modest expense for 3-D installations on their own.
Through VPFs, studios agree to pay for several years' sums equal to the print costs they would have incurred had the auditoriums not been converted to digital projection. Eventually, digital distribution will save studios millions annually.
Warner Bros. and Sony aren't expected to be on board with the DCIP agreement by the time it's announced. But they likely also will agree to consortium VPFs before long, according to a source familiar with the studio discussions.
DCIP chief and exhibition vet Travis Reid was unavailable for comment. But it's broadly expected that the DCIP will announce a deal with the first four major studios sometime before the Oct. 13 start of the ShowEast convention of theater owners in Orlando.
One lingering concern sure to stimulate lots of talk at the confab is the question of how regional circuits and mom-and-pop exhibs will pay for their digital conversions. Some have suggested that smaller-fry exhibs could be driven out of business unless they somehow manage to clamber onto the digital bandwagon.
In addition to representing a gateway technology for 3-D exhibition, digital projection also facilitates the implementation of TV-like advertising on movie screens. Regal and others are already reaping big boosts in revenue by moving to such ads to supplement or replace old-fashioned advertising slide shows.
D-cinema has been considered an inevitable development in movie exhibition for years. Haggling over deal terms has kept the studios and theater owners from getting going on a big rollout of movie-quality equipment more quickly, though Regal spent heavily to install more basic digital projectors to show commercials before feature presentations in its theaters.
The looming DCIP agreement is expected to result in the installation of upward of 3,000 digital systems in its first year of implementation. Those are likely to be sprinkled among major and midsize markets nationwide.
Circuits then may choose to do their subsequent digital rollouts market-by-market, once a minimum number of installations is in place. That could allow a quicker rollout, but the nationwide first phase is needed to facilitate the speedy addition of 3-D systems in key markets.
By Carl DiOrio, The Hollywood Reporter
The advent of Digital 3-D has triggered spirited debates in the movie industry. To some advocates, like Jeffrey Katzenberg, the boss of DreamWorks Animation, the new technology will expand the audience and provide an important new economic impetus to the industry. Skeptics feel these projections are over-optimistic and that important obstacles stand in the way.
These negative theories were embraced in a column by Patrick Goldstein that appeared Sept. 16 in the Los Angeles Times. Katzenberg says he asked the Times to publish his reply in full, but that they refused. Because of the importance of this issue, Variety offered to run Katzenberg's response.
Patrick Goldstein accuses me of being a modern-day Professor Harold Hill, as I trumpet the value of bringing 3-D to the multiplexes of the River Cities of the world. Unfortunately, I believe Mr. Goldstein gives me too much credit -- I'm afraid I lack Robert Preston's charisma, and I certainly don't have his hair. Nevertheless, I am willing to accept my inner Harold Hill on one condition: that Patrick Goldstein accept his inner Ned Ludd.
For those of you not familiar with Ned, he was the individual after whom the Luddite movement was named in the 19th century. Back then, the Luddites opposed industrialization and the term has come to describe anyone who resists technological change.
Of course, Mr. Goldstein insists he's "always in favor of embracing new technology." But he then goes on to proclaim that the arrival of color film represented "a disaster for quality cinema." Ned would be proud.
Now, Goldstein is condemning 3-D. Of course, he is entitled to his opinions and, to be sure, anytime something transformational comes along, there has always been and should always be a debate about its real value. And Goldstein is right when he says that, as with the transition to color, something may be lost. I just believe that much more will be gained.
My problem isn't so much with Goldstein's opinion as with the basis of his opinion.
To put it mildly, this is not the first time I've been criticized, and I'm always interested in hearing differing perspectives. But, in this case, it is hard to give Goldstein's point of view much credence since he is passing judgment on something he's never even seen.
You see, as the Harold Hill of 3-D, I have been banging the drum about DreamWorks' entry into this medium and have eagerly welcomed everyone and anyone who's interested to come over and check out what we're doing. Curiously, my phone has yet to ring with a call from Patrick Goldstein. He is willing to indict our work (as well as, implicitly, the upcoming 3-D work of such filmmakers as James Cameron, Bob Zemeckis, Steven Spielberg and Peter Jackson), but he hasn't bothered to come see it.
Instead, he seems to be basing his views entirely on his odd historical assertion that the arrival of color was "a step backward." Let me make clear that I am also a big fan of the black-and-white classics that Goldstein mentions and could happily expand on his list. But his thesis that color was the enemy of cinema art is patently absurd. From The Wizard of Oz to Pinocchio to Singin' in the Rain to Rear Window to The Searchers to Black Orpheus to Lawrence of Arabia to Orson Welles' The Immortal Ones, the range of brilliant color films is enormous and, let's face it, pretty self-evident.
The transition from black-and-white to color did not occur, as Goldstein implies, because of some studio conspiracy. It happened because, quite simply, human beings see in color. This is pretty basic stuff. I mean, would anyone want to see a stage play in black-and-white? By and large, resistance to color filmmaking wasn't artistic but economic since, for years, the cost of the three-strip Technicolor process was prohibitively high.
Now, digital 3-D has arrived and, I believe, will eventually become the standard because, quite simply, human beings see in 3-D. Again, this is pretty basic stuff. And it's also pretty breathtaking stuff. Unlike Harold Hill's imaginary band, digital 3-D is very real, enriching the filmgoing experience in truly phenomenal ways. It provides filmmakers with an entirely new visual vocabulary and it provides filmgoers the chance to finally cross the threshold of the screen and enter other worlds. This is why many of the industry's greatest directors are currently working on 3-D projects.
Initially, as with color, the economic bar for 3-D is high, so for the foreseeable future many films will continue to be produced in 2-D. But, eventually, I believe that all films will be shot in this remarkable medium.
This is how it is with technology. It moves forward. This is a constant. It is also a constant that there will be those who resist, stubbornly believing old is good and new is bad.
This perverse nostalgia is rather unproductive. Making value judgments about black-and-white vs. color is about as worthwhile an exercise as comparing the Model T to a Prius. They are both outstanding accomplishments.
As we face the future of film, the goal should not be to make derisive comparisons, but to celebrate the ongoing evolution of this art form. The extraordinary thing about movies has always been that the best of them -- whether silent, black-and-white, color, 3-D or whatever may be still over the horizon -- connect so strongly with so many of us. ... Even, were he to return one day, Ned Ludd.
By Jeffrey Katzenberg, Variety
Latin America is slowly waking up to digital cinema. While still far behind the U.S., Europe and Asia, the number of DCI-grade digital systems south of the border has tripled in the past year—from around 15 systems in the fall of 2007 to approximately 50 expected by the end of 2008. Mexico will have 16 systems, Brazil 15 and Ecuador four, with the rest in Chile, Peru and Argentina. Even the Falkland Islands, possibly the first territory in the world to go 100% digital, has upgraded its single screen in Stanley with digital 3D.
3D is the primary factor driving digital conversions. Of the 50 digital systems, all but a few of the original 1.3K installations have been upgraded to 3D. The addition of 3D makes it easier for exhibitors to justify the cost, as they see immediate results at the box office. Latin American exhibitors typically increase the ticket prices by 10 to 20% over the standard, with the houses frequently selling out and the 3D titles running significantly longer than non-3D titles. For instance, it’s not uncommon for a popular 3D title, such as Disney’s Meet the Robinsons, to be on screens for a good five months in markets like São Paulo, Brazil.
In some cases, the cost of the entire digital conversion, including projectors, servers and 3D, can be recouped within a relatively short six to nine-month period. As an added bonus, some exhibitors have partnered with local third-party companies who sponsor the 3D systems in exchange for advertising and other promotional considerations.
While the conversion of a few select theatres to 3D can be justified, the conversion of the rest of Latin America’s approximately 9,000 screens from 35mm to base-level 2D digital is a far more difficult proposition. Digital titles dubbed or subtitled into the local language are still in short supply, and the audience sees little increased value with 2D digital over 35mm presentations. So far, there has been little to no support from the major U.S. film distributors in providing incentives for 35mm-to-digital upgrades.
Contributing to the past year's growth in digital systems is the general decline of the U.S. dollar relative to Latin American currencies. In the past five years, the dollar has dropped to approximately 50% of its previous value relative to the Brazilian real, and that has helped U.S.-based companies such as Christie, Dolby and Real D establish stronger footholds in the South American market. High import restrictions, taxes and duties are still a major obstacle when building or upgrading a theatre. In some markets, the combined fees can add up to 100% of the cost of equipment, making digital upgrades prohibitively expensive. Most U.S.-based equipment manufacturers have found solutions by working with local system integrators who know the market and regulations, and frequently can ease the difficulties of getting equipment into the area. Christie Digital Systems, of Cypress, CA, has the leading position in digital projector sales. Craig Sholder, Christie's VP of entertainment solutions, comments, "For years, Christie has worked diligently in Latin America establishing a trusted sales and support network for our 35mm projectors. Our success with digital is a natural follow-on to our tradition of supporting the local exhibitors."
U.S.-based Cinemark International currently operates 1,030 screens in Latin America, with 11 converted to digital 3D, using primarily Christie projectors, Doremi servers and Real D 3D equipment. To date, Cinemark has strategically placed five systems in Brazil, two in Mexico City, and one system in each of Chile, Argentina, Peru and Colombia. Valmir Fernandes, president of Cinemark International, says, "It's hard to say how far and fast we will go, but we likely will have 20 or 30 digital 3D screens installed by the end of 2009. We are getting third-party sponsorships for new 3D systems. 3D has been very good for films for kids and teens, but we are looking at the broader digital rollout and not only at 3D. Cinemark is working side by side with other exhibitors in the area to find the best solutions for the region."
Cine Hoyts currently has 160 screens in Latin America, with 87 in Argentina, 50 in Chile, 15 in Brazil and eight in Uruguay. So far, only two—one in Santiago, Chile, and another in Buenos Aires, Argentina—are equipped for digital 3D. Cine Hoyts expects to add up to six more digital screens by mid-2009, all 3D-enabled. Heriberto Brown, Cine Hoyts general manager, notes, “We have received a lot of pressure to start deploying digital screens but are receiving little from distributors in the way of a VPF or similar incentive schemes to help us go faster. Some important titles are not distributed in South America because of the additional cost to either subtitle or dub a copy. In addition, we need more marketing support from the distributors to increase the knowledge of digital in theatres.”
Mexico differs from the other Latin American countries in that it is driven much more by U.S. issues and trends. Cinepolis, the world's fifth-largest cinema circuit with theatres in Mexico, Guatemala, Costa Rica, Panama, El Salvador and Colombia, and with over 91 million admissions in 2007, has made an exclusive partnership with Real D for 3D systems.
Real D’s announced rollout of 500 3D screens has begun with six new screens installed for the release of Journey to the Center of the Earth, with installations planned to continue through 2010. "Cinepolis confirms its commitment to innovation and client service by placing its confidence in 3D technology. As a company that has always been at the forefront in market innovation and new products and services, Cinepolis is proud to join forces with Real D. We believe this cutting-edge technology represents the future of the exhibition business, and with this deal Cinepolis moves ahead in the very competitive Latin American market," said Miguel Mier, chief operating officer of Cinepolis.
XpanD, another provider of 3D technology, has made advances in Latin America with installations in Mexico and several planned with Box Cinemas in Brazil, and has announced a deal with Rain Networks, Brazil's largest e-cinema provider. Working with Rain, XpanD expects to deploy 100 sites in Latin America in the next two quarters and sees further opportunities in the Latin American market.
By Bill Mead, Film Journal International
The 2008 Cannes Palme d'Or Winner Is the First Digital Release to Be Exclusively Distributed via an Electronic Network to Theaters in Europe
Palme D’Or winner Entre les Murs, a Laurent Cantet film released in France on September 24th, is the first film in Europe to be exclusively delivered to digital cinemas via an electronic network, thanks to SmartJog, the global leader in managed file delivery, and Cinego, a digital film release management tool.
To manage the digital release efficiently, Haut et Court, the French distributor of the film, selected Cinego, a tool developed by CN Films as part of a pilot project. As an online platform, Cinego enabled Haut et Court not only to initiate and manage deliveries of the film to cinemas, but also to supervise the creation and delivery of the Key Delivery Messages (KDMs) for the Digital Cinema Packages (DCPs).
DCP mastering was done by Mikros Image, a French digital post-production and visual effects company.
In the past, delivering films to digitally equipped cinemas was done by duplicating and shipping hard disks. For the first time in Europe, Haut et Court decided to deliver the film exclusively via a network. The SmartJog service, directly integrated with Cinego, has thus enabled electronic delivery to more than 30 screens, including the theater chains CGR, Kinepolis and Gaumont/Pathé, as well as independent theaters. SmartJog operates a fully managed distribution and file transfer platform that can send Digital Cinema Packages (DCPs) and associated content (keys and trailers) to digital cinemas, eliminating the time-consuming logistics typical of a release. Currently over 90% of French digital screens are connected to the SmartJog network.
This digital release is made possible by the D-Platform pilot project supported by the European Union MEDIA program. The European Union seeks to expand the digital availability and circulation of European films.
Haut et Court is distributing the film in France, while Sony Pictures Classics is handling the film distribution in the United States under the title The Class. The North American premiere of the movie is at the New York Film Festival on September 26.
SmartJog, a subsidiary of the Multimedia division of the TDF Group and the global leader in digital delivery with a presence in 65 countries, offers a fully managed distribution and file transfer platform for the secure, fast and reliable digital delivery of content. With SmartJog, clients can digitally deliver any media, of any size, anywhere across SmartJog's secure network. SmartJog manages the digital cinema offerings for the TDF Group, and has worked with the other companies in the division to develop and produce the first European digital distribution platform.
Walt Disney, Paramount, Twentieth Century Fox and Universal are soon expected to announce a long-sought $1.1 billion digital cinema deal that Hollywood hopes will boost attendance, cut costs and enable more 3-D viewing, sources close to the deal said on Thursday.
The studios declined to comment, but sources with knowledge of the talks said the deal to help co-finance the upgrade for a group of movie chains was virtually complete, with an announcement expected within days or weeks. Long delayed by debate over who should pay for the system, digital cinema offers a potential solution to declining movie attendance at a lower ongoing cost.
Fox, a unit of News Corp, was the first to sign the deal this summer, but its participation was contingent on other studios agreeing. Early this month, sources said General Electric Co's Universal and Walt Disney Co had come on board, along with Viacom Inc's Paramount, clearing the way for DCIP -- which represents Regal, Cinemark Holdings Inc and AMC Entertainment Inc, operators of 14,000 screens -- to reach a deal to help finance the theater upgrades.
DCIP would not comment.
"Things are progressing well," said Dick Westerling, a spokesman for Regal, the nation's largest theater exhibitor, when asked if a deal had been reached. "Technological change is very important and we're certain this will move the needle."
DCIP was formed over a year ago and first hoped for a deal by late 2007, but talks hit snags over terms requiring studios, exhibitors and content providers to pay usage and other fees to help pay off loans provided by institutions such as JPMorgan Chase & Co to buy and install new digital equipment. The credit crunch and issues involving standards, equipment procurement and performance criteria also delayed the talks.
The upgrades will enable studios to send movies digitally to theaters, saving them billions of dollars in print and delivery costs. Once outfitted with digital projectors, theaters can add 3-D capabilities.
Hollywood has a lot riding on the conversion. Studios such as DreamWorks Animation SKG Inc and Disney plan to roll out 3-D films and need enough 3-D screens to support their slates.
"There's a lot on the line. There are 13 movies coming out industrywide in 3-D soon, so everyone's incentivized to move this along," said Michael Lewis, chief executive of RealD, a provider of 3-D systems for the cinema market.
"3-D is not just a passing trend, it is being embraced by the world's finest filmmakers," Dick Cook, chairman of the Walt Disney Studios, said at a press event this week. He said Disney will release five movies in 3-D next year, which he added is more than any other studio.
Hollywood and theater chains believe 3-D will not only boost attendance, but also command higher ticket prices.
"Our plans are to continue to charge a premium for 3-D," said Westerling at Regal, when asked if recent market turmoil would cause any change in those plans.
About 5,000 of the 37,000 cinema screens in the United States are digitally equipped and the ultimate aim is to transform all 125,000 screens worldwide.
There are around 1,300 3-D screens in the United States, primarily provided by RealD, said Lewis, whose company has commitments for 5,000, many of which depend on clinching the DCIP deal.
Assuming the deal goes through soon and the roll-out begins in earnest by December, there should be about 2,500 3-D screens in the United States by March, when DreamWorks's 3-D feature, Monsters vs. Aliens is released.
By Sue Zeidler, Reuters
TDVision stores the 3D stereoscopic perspective in such a way that the Left video stream (2D) is kept intact and stored in the video_data section of the digital video stream. The Right perspective, the "Stereo Disparity" or the "Delta" (an option selected at encoding time) is stored in a different section of the video stream, and is therefore ignored by legacy 2D decoders, providing full 2D compatibility.
The TDVCodec has the flexibility to store the full right frame, the Stereo Disparity (SD) or the XOR difference (Delta) in that section, to optimize bandwidth or quality. When the stream is received, a TDVision decoder will automatically read the identifier, decode the Left frame, retrieve the Right section and deliver the decoded stereoscopic information to the respective frame buffer(s), preparing it for the display in use (checkerboard, full stereo, side by side) via HDMI EDID/DDC.
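The XOR "Delta" option described above can be illustrated with a toy sketch. This is an assumption for clarity, not TDVision's actual codec: plain Python byte strings stand in for video frames, and no compression is applied. It shows why a legacy decoder can simply play the left stream while a 3D-aware decoder losslessly rebuilds the right eye from left + delta.

```python
# Tiny stand-ins for one scanline of the left-eye and right-eye frames.
left = bytes(range(16))
right = bytes((b * 7 + 3) % 256 for b in range(16))

# Encoder: the left frame is stored normally; only the XOR "Delta" goes
# into a side section that legacy 2D decoders skip.
delta = bytes(l ^ r for l, r in zip(left, right))

# Legacy 2D decoder: plays `left` as-is and ignores `delta`.
# 3D decoder: reconstructs the right eye exactly, with no invented pixels.
reconstructed_right = bytes(l ^ d for l, d in zip(left, delta))
assert reconstructed_right == right
```

Because XOR is its own inverse, the reconstruction is bit-exact, which is what allows the claim of no occlusion or depth-estimation artifacts; the bandwidth saving then depends on how well the (often sparse) delta compresses.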
TDVision has overcome all the drawbacks associated with previous 3D technologies:
- TDVCodec is fully compatible with legacy 2D decoders, and with 3D wherever TDVision technology is available or implemented.
- By using any of the available 3D options, the video stream can be optimized for quality and/or bandwidth.
- TDVCodec does not introduce the visual artifacts seen in side-by-side and 2D-plus-depth approaches (TDVision has no occlusion, transparency or depth limitations).
- Implementation requires just a modification on the firmware and/or our TDVCodec decoder in software.
- No loss in quality or resolution.
- Provides the best HD quality for both 3D and 2D formats.
Once a video stream is encoded with TDVision's technology, it can be played back on a normal legacy 2D decoder, since the complementary 3D information is simply discarded at playback time, whereas an updated decoder will retrieve the 3D information properly.
Depending on the method selected at encoding time, the Right information, Delta information or Stereo Disparity can optimize the bandwidth or the quality, providing up to full HD 3D.
There are no visual artifacts or distortions: TDVision stores the complete stereoscopic pair, which is reconstructed and decoded exactly as originally captured -- no detriment, no invented pixels, no occlusion errors.
Modifying the firmware by adding our decoding method or using our software decoder is all it takes for the end user to achieve full 3D immersive experience everywhere, basically a simple download or firmware update.
TDVision's TDVCodec just requires:
– ENCODING: Encoding of the Left/Right video streams using our TDVCodec is all you need to do to the content.
– DECODING: A TDVCodec firmware update on the set-top box/DVD player/receiver, or our software decoder, is all that's needed to decode in 3D or in 2D.
On the receiver side, we can have any of the following:
– A legacy 2D hardware decoder (Satellite, cable, DVD player, set top box).
– A TDVision 3D hardware decoder (Firmware updated set-top boxes).
– A 2D software decoder (Windows Media player).
– A 3D software decoder (TDVision's DejaView or AlterSpace both using our TDVCodec).
On the visualization side, the TDVCodec can provide information to:
- 2D Legacy Monitor.
- 3D Checkerboard (for Samsung and Mitsubishi RPTVs).
- 3D stereoscopic cinema projectors or any other 3D display.
- Our TDVisor, the 3D HD 720p portable immersive multipurpose display.
All of these support full 3D HD at 720p or 1080p, as well as the best available 2D resolution.
You can download the full whitepaper by TDVision for further information.
Thursday, September 25, 2008
Adlabs, the movies division of India's Reliance ADA Group, says it has become the world's first cinema chain to commercially deliver movies to its theaters by fiber optic cable. The company says it has run more than 10,000 commercial screenings in fully DCI-compliant 2K d-cinema, of which more than 2,000 have been transmitted via fiber optic cable.
Movies are encoded at Adlabs' d-cinema mastering facility at Film City in Mumbai and sent over a 200Mbps connection to its Reliance ADA headquarters in Navi Mumbai. There they are sent by dedicated cable to theaters as far as Ahmedabad, Gujarat, nearly 300 miles to the north.
Reliance ADA, which on Friday became the principal investor in the next incarnation of DreamWorks, is also one of the world's top five fiber-optic cable owners through its Reliance Communications unit. Unlike other Indian groups that have backed less powerful e-cinema systems, Adlabs has invested heavily in 2K d-cinema equipment and facilities. Its Wadala plex in Mumbai was its first to convert fully to digital presentations in April.
"Adlabs is implementing globally recognized digital cinema technology standards and processes and is, at the same time, pushing the envelope by introducing distribution technologies such as OFC," said Anil Arjun, Adlabs Films CEO. "We intend to be an end-to-end world-class service provider covering mastering to delivery on site and are looking at significant expansion in this space aimed to cover 500 screens in the initial phase."
Patrick von Sychowski, chief operating officer of Adlabs Digital Cinema, said: "The Adlabs Film City complex -- which includes the Digital Cinema Mastering facility -- is the first facility in Asia to receive the prestigious international Federation Against Copyright Theft accreditation. Using fiber for end-to-end delivery, we are able to offer unparalleled picture and sound quality."
In 2005-06, Japanese telco NTT, in partnership with several Hollywood studios, experimented with fiber-optic digital delivery, but this effort did not lead to commercial use.
By Patrick Frater, Variety
Codex Digital, specialist in high-resolution media recording and workflow systems, announced the extension of its product family with the Codex Lab – a digital-film-lab-in-a-box that forms the hub of fast and efficient tapeless workflows for broadcast and motion picture productions. The new Lab is modular, offers enormous recording capacity, and provides all of the deliverables needed for production, post and archive processes. It can create a complete set of deliverables in less time than traditional systems take to make a single copy.
The Lab ingests digital production material from Codex recorders and also from tape, telecine and other digital systems. It can be expanded to store over 500 hours of digital cinema footage, or 1,000 hours of high-end broadcast material, plus audio. When used in Standard Definition applications the Lab can contain 3,600 hours (around five months) of recordings.
All material stored in the Lab is immediately available, at any time, for on-demand daily deliverables and reprints. When an offline edit is complete, the Lab automatically generates the required finishing-files from the EDL – in minutes or hours, rather than days. It will also play-out to multiple channels of video (HD or SD), with automatic processing of shot-lists or EDLs.
Ultimately the Lab is the hub of a completely tapeless workflow. It can manage a wide range of broadcast productions (multi-camera episodic TV, drama, light entertainment) or digital motion pictures. Multiple productions can be handled on a single unit. It is designed for easy integration into the MCR of an editing/VFX facility, and it can also be directly connected on set or location.
The Codex Lab converts original HD material – or any format from SD to 4K – into editing files for Avid, Apple or Adobe, into VFX, finishing and archive tapes, as well as viewing files. Output file formats include DPX, MXF, DNxHD, QuickTime, AVI, JPEG, BMP and BWF (WAV) with full metadata, resizing, colourspace conversion and LUTs for look management.
Codex has designed the Lab so that it can be configured and upgraded in a variety of ways according to the changing needs of users. It can be ordered with one or two dual bays for Codex Portable diskpacks, and a dual bay for Codex Recorder diskpacks. The Lab will also hold up to four internal LTO4 tapedrives, or can control external LTO4 robots.
High-speed RAID6 storage is expandable to over 100TB in removable blocks of 12 or 24TB. This provides 500 hours of HD 4:4:4/24fps cinema-quality compressed, or 125 hrs uncompressed, 1,000 hours of HD 4:2:2/24fps (high broadcast-quality), or 3,600 hrs of SD PAL/25fps.
The Lab offloads HD footage up to ten times faster than realtime, with SD over twenty times faster, and it can produce multiple deliverables in parallel – underlining the ability of all Codex workflows to deliver unrivalled efficiency, productivity and creativity.
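The capacity figures above imply average data rates that can be sanity-checked with a quick calculation. This uses the 100TB maximum and the quoted hour figures; the derived rates are implications of those numbers, not published Codex specifications.

```python
# Average sustained data rate implied by each capacity/duration pair
# quoted for the Codex Lab (100 TB maximum storage assumed throughout).
def implied_rate_mbps(capacity_tb, hours):
    """Megabits per second implied by storing `hours` in `capacity_tb`."""
    return capacity_tb * 1e12 * 8 / (hours * 3600) / 1e6

for label, hours in [("HD 4:4:4 cinema (compressed)", 500),
                     ("HD 4:2:2 broadcast", 1000),
                     ("SD PAL", 3600)]:
    print(f"{label}: ~{implied_rate_mbps(100, hours):.0f} Mbps")
```

The ~444 Mbps figure for compressed 4:4:4 HD sits plausibly between uncompressed cinema rates and broadcast codecs, which is consistent with the "cinema-quality compressed" description.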
Codex Digital expects to ship the first production-ready units in the first quarter of 2009, and will publish pricing accordingly.
Source: PostProduction Buyers' Guide
Panasonic has developed the world's first 3D full HD Plasma Theater System, which enables the viewing of true-to-life 3D images by using a 103-inch plasma television and a Blu-ray Disc (BD) player, distributing full high-definition (HD) (1920 x 1080 pixels) images to left eye and right eye. Panasonic will present this system at CEATEC JAPAN 2008, which is due to be held at Makuhari Messe from September 30 to October 4, 2008.
Human beings feel the 3D impression because each of the left and right eyes recognizes different images. Panasonic's system comprises a 103-inch plasma television and a BD player that plays back BD onto which 3D images, consisting of left- and right-sided 1080p full HD images, are recorded. By wearing active shutter glasses that work in synchronization with the plasma television, the viewer is able to experience 3D images formed with twice the volume of information as regular full HD images, and enjoy them together with high quality surround sound. This system enables full HD signal processing on each of the left and right images in every process -- recording, playback and display.
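The "twice the volume of information" claim is simple to verify for uncompressed video. The 8-bit RGB and 24fps figures below are illustrative assumptions; the article states only that the system carries left and right 1080p streams.

```python
# Uncompressed data rate for stereoscopic full HD, assuming 8-bit RGB
# at 24 fps (illustrative assumptions, not Panasonic specifications).
def raw_rate_mbytes(width, height, fps, eyes=1, bytes_per_px=3):
    """Uncompressed video payload in MB/s."""
    return width * height * bytes_per_px * fps * eyes / 1e6

mono = raw_rate_mbytes(1920, 1080, 24)
stereo = raw_rate_mbytes(1920, 1080, 24, eyes=2)
print(mono, stereo)  # stereo carries exactly twice the data of mono
```

This doubling is exactly why earlier systems resorted to halving vertical resolution or squeezing two views into one frame, and why Panasonic needed full-HD signal processing for each eye at every stage.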
Previous consumer 3D display systems have encountered many different problems, including reduced vertical resolution caused by a 3D display method that divides the scanning lines between the left and right eyes, and picture quality degradation caused by pixel skipping that results from the squeezing of two (left and right) screens' worth of full HD images into one screen's worth of data capacity for image storage and transmission. Until now, there has not been a system capable of displaying quality equivalent to the original masters of Hollywood 3D movies.
Panasonic has developed the following technologies for realizing the new system:
Plasma display: The performance of Panasonic's plasma panels, whose self-illumination allows for excellent video response, has been brought out to the fullest extent in the development of a 3D driving system that displays the left and right images together as full HD images.
BD: Using the optical disc technology cultivated by Panasonic over many years, and the authoring technology developed by Panasonic Hollywood Laboratory (PHL), it has been possible to record 3D images -- consisting of respective left and right 1080p full HD images -- onto a single, standard BD.
BD player: Panasonic has developed a technology to decode and play back the left and right full HD image data recorded to the BD in real time.
3D images: Panasonic has produced 3D content that allows viewers to experience fascinating 3D images, including dynamic images of athletes at the Olympic Games and animated Hollywood movies. This content will be shown in a special theater set up in the Panasonic booth in Hall 3 at CEATEC JAPAN 2008.
Panasonic will work to promote the 3D system through standardization of a 3D format at the Blu-ray Disc Association (BDA), with the cooperation of the Hollywood studios and consumer electronics companies that are BDA members, so that consumers can enjoy 3D images in the comfort of their own homes.
If nothing else, image acquisition tools are now as diverse as the people that use them. At IBC all of the major camera manufacturers, and a few non-traditional ones, continued to advance their optic systems and add new features, while (in some cases) reducing prices.
The Arri Arriflex D-21, a film-style digital camera, was shown with significant improvements to the image-processing engine to provide improved image quality. New output options include a 2K raw data output mode and the use of anamorphic lenses.
Based on Arri Imaging Technology (AIT), the D-21 produces outstanding images with a single Super-35-sized CMOS sensor and the same lenses as 35mm film cameras. It also has cinematic depth of field and can be used with anamorphic lenses. The camera includes a bright optical viewfinder and is capable of acquiring images at variable frame rates. In addition, the D-21 outputs either the raw sensor data for a 2K workflow or a standard HD signal.
Ikonoskop, the Swedish maker of the A-cam SP-16 lightweight 16mm film camera, introduced a new digital motion picture camera that moves it into direct competition with the world’s best high-definition video cameras.
The A-cam dII is the only electronic camera that offers uncompressed images in high-definition RAW format. Image processing is done in post on a computer, rather than in the camera. When recording in 1920 x 1080 pixels from one to 60 RAW frames per second, users can master directly to HDCAM, HDCAM SR or any other full HD format. To print the video to film, the native 1.78:1 aspect ratio can be used without any loss of pixels. Ikonoskop calls this “WYSIWYG-HD.”
The A-cam dII records at 240 MBps to an 80 GB memory cartridge developed by the manufacturer. A single cartridge holds about 15 minutes of video, audio and metadata.
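The 240 MBps figure presumably corresponds to the camera's 60fps maximum; at cinema frame rates the same cartridge lasts longer, which would reconcile it with the quoted ~15 minutes. The per-frame payload below is derived from the quoted peak rate, not from a published Ikonoskop spec.

```python
# Recording time per 80 GB cartridge at different frame rates, using a
# per-frame size implied by the quoted 240 MB/s peak at 60 fps
# (a derived assumption, not a manufacturer figure).
PEAK_RATE_MB = 240            # MB/s at the 60 fps maximum
FRAME_MB = PEAK_RATE_MB / 60  # implied payload per frame, ~4 MB

def minutes_per_cartridge(fps, cartridge_gb=80):
    """Approximate minutes of recording on one cartridge at `fps`."""
    return cartridge_gb * 1000 / (FRAME_MB * fps) / 60

print(f"{minutes_per_cartridge(25):.0f} min at 25 fps")  # ~13 min
```

At 25 fps this lands close to the article's "about 15 minutes" once audio and metadata overhead are set aside, while full 60 fps capture would roughly halve that.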
The camera’s Super-16-sized CCD sensor works with all Super-16mm lenses. This enables use of a wide range of cine and prime lenses. As one of the few camera manufacturers in the world to support all the de-facto industry standard mounts, Leica M, PL, IMS and C-mount lenses can also be used.
The camera is extremely lightweight—3.3 pounds—and costs under $10,000 including lens and 80 GB flash memory card. It ships with Ikonoskop’s 9mm, f/1.5 wide-angle cine lens. Ikonoskop is now taking pre-orders for delivery by the end of the year.
Thomson introduced a new “Elite” series of LDK HD cameras. There are three new models: the Thomson Grass Valley LDK 4000 Elite, LDK 8000 Elite (in Standard and WorldCam versions), and the LDK 8000 SportElite. All offer new levels of camera performance while opening the product up to a wider production community. Thomson also added 24p capability to its Thomson Grass Valley Infinity digital media camcorder.
Among the new enhancements, the LDK Elite and SportElite HD series significantly improve on the existing cameras’ internal digital signal processing circuitry by incorporating all-new DSP circuits. It’s a completely updated software platform that performs all camera image management functions – such as knee, gamma, contours, and variable matrix – with three channel 22-bit digital precision. The SportElite adds 2x Super Slow-Motion sampling functionality in both 720p and 1080i HD formats.
A newly designed chipset used across the entire Elite series contains additional features, such as digital cosmetics – sometimes known as negative skin contours – with independent dual-skin tone selection. Extensive colorimetry and color-matching tools are also available, giving users the choice of performing matrix processing before or after gamma.
Among the other majors, Panasonic didn't exhibit on the show floor, but did host a digital cinematography session where the P2 solid-state versions of its Varicam (the AJ-HPX2700 and 3700) were discussed along with the company’s AVC-Intra codec and its AJ-HPX3000 camera. Support for AVC-Intra compression across the product lines of Autodesk and Avid Technology was announced at the show.
Sony introduced the PMW-EX3 XDCAM EX solid-state (SxS cards) camcorder to Europe, with its interchangeable lens. The company also added a new compact “Hybrid-ready” HDV camcorder to its portfolio: the HVR-Z5E, which borrows from the existing HVR-Z1E and DSR-PD170P camcorders to offer excellent low-light performance, a new 20x “G” lens and hybrid workflow with the optional HVR-MRC1K solid-state memory recorder. The lightweight unit can be mounted on a HDV camcorder and provide users with a hybrid recording system that can record to CompactFlash solid-state memory and tape simultaneously. A single 16 GB CF card can record about 72 minutes of HD material at 25 Mbps.
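The 72-minute figure for a 16 GB card can be roughly reproduced; at 25 Mbps of video alone a 16 GB card would hold about 85 minutes, so some capacity evidently goes to audio, container and filesystem overhead. The 15% overhead below is an assumption chosen to match the quoted figure, not a Sony specification.

```python
# Approximate HDV recording time on a CF card, reserving a fraction of
# capacity for audio, container and filesystem overhead (the 15% figure
# is an illustrative assumption, not from Sony).
def record_minutes(card_gb, video_mbps, overhead=0.15):
    """Minutes of recording on a card of card_gb at video_mbps."""
    usable_bits = card_gb * 1e9 * 8 * (1 - overhead)
    return usable_bits / (video_mbps * 1e6) / 60

print(f"{record_minutes(16, 25):.0f} min")  # close to the quoted ~72 min
```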
By Michael Grotticelli, StudioDaily
Codex Digital, specialist in high-resolution media recording and workflow systems, continues to shrink the world of digital cinematography for broadcast and motion picture production with the announcement of its new On-Camera solid-state recording system. The On-Camera, set to ship in Spring 2009, is packed with many of the same advanced features as the ground-breaking Codex Portable. It is designed to clip on to HD, 2K and 4K cameras, giving the camera crew the freedom to move anywhere around a set or location with enormous recording capacity.
The On-Camera can record over two hours of 2K cinema footage, or over four hours of high-end broadcast material, as well as audio and metadata. As with all other Codex recording systems, and unlike comparable products, the On-Camera enables true tapeless production. It offloads footage up to ten times faster than realtime, in file-formats ready for editing and compositing – underlining the ability of Codex workflows to deliver unrivalled efficiency, productivity and creativity.
Weather-resistant, rugged and weighing less than 3kg, the On-Camera is designed for ‘run and gun’ production – around a set, up a mountain or in a field – on the full range of broadcast drama, episodic TV and light entertainment, as well as digital motion pictures.
The Codex On-Camera is always ready-to-record and runs off all standard camera batteries (12-28V). Flexible I/O configurations mean the On-Camera can record from virtually every digital camera available today – including all HD cameras in video mode, plus data-mode from cameras such as the ARRI D-21 and DALSA’s Origin, and others as their data ports become available.
The On-Camera also features a “Mutter Track” microphone input, which allows the user to add comments during a take for shot logging and notes.
In production, the On-Camera provides immediate playback and review of footage on its own daylight-readable screen, or remotely over a wireless network. It delivers DPX and MP4 files directly, and in conjunction with the Codex Transfer Station or Codex Lab, can deliver shots in all industry-standard formats, including DPX, MXF, DNxHD, QuickTime, AVI, JPEG, BMP, and BWF (WAV) files. It can even provide native-mode files that editing-systems can use with no importing at all.
Codex Digital expects to ship the first production-ready units in Spring 2009, and will announce pricing at that time.
The broadcasting industry continues to shift towards high definition, raising the bar for television channels' transmission standards and operating procedures. The digital process from start to finish, which includes encoding, sending and airing, must transition from standard definition (SD) to HD file formats. New channels are most commonly launched on HD platforms, receiving all content in HD format and, if necessary, down-converting to SD.
As broadcasters add HD channels to their line-ups, a delivery platform is needed that can handle the movement of HD-quality files. The industry is working to identify a short list of codecs and containers that are acceptable across broadcasters' entire workflow, from editing systems to video servers and long-term archiving systems, in order to service content for HD channels through digital delivery. The broadcasters that are first to broadcast in HD will be the ones who begin standardising these specifications. The same process occurred when MXF emerged as the standard for SD broadcast.
Broadcasters need a way to receive HD files from content distributors, however not all digital delivery methods are capable of moving HD media. Physical tape shipment and FTP technology have become outdated as new delivery methods are available to meet servicing needs, both in terms of efficiency and cost.
Since SmartJog's inception, it has been a file-agnostic platform so that any type of content can be sent and received across its network, regardless of file size or format. As broadcasters make the transition from SD to HD, SmartJog's existing technology and hardware can be immediately leveraged to allow the movement of HD media for direct ingest into a HD broadcast environment.
HD digital delivery with SmartJog can be seamlessly integrated into broadcasters' current technology. One common workflow begins with a 'ready-to-air' source file that is compatible with video server ingest, such as MXF MPEG-2 @ 50Mbps. Once file delivery has completed, the SmartJog server can also be used to transcode to various formats such as DNxHD or DVCPro100 to facilitate specific editing needs. The integration and automation features that the SmartJog service provides, along with its acceleration technology available over both terrestrial and satellite connectivity, make it a clearly differentiated product when compared against traditional file delivery systems.
Source: IBC Daily
3ality Digital is a pioneer of modern stereoscopic production, with projects to its credit such as the critically-acclaimed U2 3D. The 3ality name is known by anyone involved in modern digital stereoscopic content creation. The 3ality team have years of experience in digital production of stereoscopic films, stereoscopic broadcasting and many other kinds of stereoscopic content. They have also designed and built camera systems and state-of-the-art electronics for production, post production, broadcasting and exhibition.
3ality has now used that image correction expertise to create the 3flex SIP2100 – a system for post production that can analyse stereo content, correct faults and help create technically perfect deliverables. The SIP2100 has tools for handling geometric, color, dimensionalisation and sync issues that can occur in newly created or legacy stereo content. Working together with systems like Quantel's Pablo, iQ, Sid or sQ, users can quickly and easily identify and fix problems, make artistic decisions and create versions suitable for different delivery requirements, in a fraction of the time that might otherwise be required.
At the heart of the 3flex SIP2100 are sophisticated algorithms running on powerful hardware which can analyse and correct real time stereo HD-SDI signals without any rendering whatsoever. The 3flex SIP2100 effectively works in 'play mode' which allows signals to be processed 'on-the-fly'. This means the 3flex SIP2100 can be connected before or after a Quantel or other device and there is no internal rendering to disc required.
This makes the 3flex SIP2100 suitable for many uses:
- Shot selection of rushes before or during off-line editing
- Pre-correction of rushes before the online conform session
- Analysis or correction during online color correction, compositing or finishing
- Final analysis, correction or special formatting during playout and mastering
- Analysis correction or special formatting in device to device dubbing (e.g. VTR to VTR)
- Quality control point for Stereo CGI workgroups
- Tool to aid stereo restoration projects
- General monitoring of material quality at any point during a project
As the 3flex SIP2100 operates in a fully live mode it can be used anywhere in an HDSDI signal chain throughout a Post house, including edit suites, color correction suites, machine rooms or viewing theatres – it is a major technology step forward for Post – but the story doesn't stop there.
Tuesday, September 23, 2008
Not only was the BBC's core operation at the Beijing Olympics fully HD and tapeless for the first time, its transmission was enabled by BBC-devised compression technology Dirac. The use of Dirac, under development by engineers from BBC Research & Innovation for several years, represented the first broadcast application of the technology.
"Dirac has now reached a degree of maturity," reports Tim Borer, who heads the BBC R&I Dirac team. "Dirac is a whole family of video codecs derived from the same technology. Dirac Pro is designed for professional links within and between studios; other versions of Dirac are aimed at contribution links and for end user distribution."
Host telecommunications from the venues was largely uncompressed, but for some venues the BBC was unable to secure HD links. To solve this problem it fitted Dirac Pro 270 encoders, manufactured by NuMedia Technology, to compress the signals for carriage over SD-SDI links to Beijing's IBC. "Alternative compression schemes don't travel over standard SD links and have higher latency," he adds.
"Dirac's compression is at least comparable with H.264 (MPEG-4 AVC) but it has two significant advantages," says Borer. "These are that it is royalty free as well as being open source and cross platform."
BBC R&I has developed a special software version of the algorithm to compress the Super Hi-Vision (SHV) broadcast format proposed by NHK and being demonstrated by the Japanese broadcaster, EBU, RAI and the BBC at IBC. "NHK is looking to develop Dirac hardware for encoding SHV," says Borer. "The native bitrate from the SHV camera is 24 Gigabits a second and Dirac is one option for this since it provides a more graceful compression than MPEG. The wavelet compression on which Dirac is based fits well with SHV since wavelets scale well for high resolutions."
The quality of Dirac's compression continues to improve above 40Mbps whereas H.264's quality begins to deteriorate at that point. "Often the BBC wants to use contribution links with bitrates as high as 80Mbps and currently the only option for that is MPEG-2. Dirac now provides another option."
There are concrete plans to implement Dirac Pro 270 within the central switching area of TV Centre.
By Adrian Pennington, IBC Daily
Manzanita Systems is demonstrating its MP2TSAE 4.0 - a major upgrade to its MPEG-2 Transport Stream Analyzer Enhanced Version software, available shortly. This new release includes IP stream analysis, detects DVB subtitles and teletext, and identifies the presence of closed captioning in H.264 video. SCTE 35 splice point analysis has been added as well as additional audio formats and improved H.264 video elementary stream support.
MP2TSAE has a command line and a GUI interface, and performs an exhaustive analysis on transport streams, verifying their compliance with MPEG-2, DVB, ATSC, or CableLabs standards. The enhanced analyser can detect and log over 500 different types of warning and error conditions. The included MBatch utility can monitor a 'watch' folder and analyse any newly added transport stream files, or it can automatically analyse all files in a selected folder.
Manzanita has also recently started shipping SPMux 4.0, a major upgrade to its System and Program Stream Multiplexer software. The latest release of the multiplexer includes support for private_stream_2, an enhanced demultiplexer utility, Microsoft Windows Vista operating system support, and an improved GUI.
Manzanita Systems has also announced MP2TSMS 5.0, an upgrade to its MPEG-2 Transport Stream Multiplexer for Single Programmes software. MP2TSMS 5.0 includes improved support for audio elementary streams, an enhanced demultiplexer utility, and Microsoft Windows Vista operating system support.
MP2TSMS utilises Manzanita's multiplexer engine to produce professional grade, single programme transport stream files. The software multiplexer supports MPEG-1 and MPEG-2 video stream formats. It also supports the following audio formats: MPEG, DTS, Dolby AC-3, Dolby Digital Plus (E-AC-3), AAC, and the newest addition, HE-AAC v1 and v2. Default multiplexing configurations for compliance with the CableLabs VoD SD and HD Content Specification are also now included.
In addition to supporting HE-AAC v1 and v2 elementary streams formats, MP2TSMS further improves the support of several other audio formats. The multiplexer now accepts low sampling rates for MPEG audio layers 1, 2, and 3. Additionally, support for multiple sub-streams and small block frames in Dolby Digital Plus audio elementary streams has been added.
Manzanita Systems has also announced MP2TSMM 5.0, an upgrade to its MPEG-2 Transport Stream Multiplexer for Multiple Programmes software. The latest release of the multiplexer includes expanded audio elementary stream support, an improved demultiplexer utility, the MBatch scripting utility, enhanced remultiplexing, and Microsoft Windows Vista operating system support.
By Heather McLean, IBC Daily
Marquis is demonstrating Medway version 2.3's enhanced support for HD formats to provide broadcasters and content producers with more flexibility when migrating their workflows from SD to HD processes.
The new formats supported include DNxHD, MPEG-2 HD long GOP, MPEG-2 HD I frame, DVCPRO-100 HD and Sony XDCAM HD 50 422, and additional flexibility for bidirectional transcoding from DNxHD to either MPEG-2 long GOP or MPEG-2 I frame. Marquis says Medway V2.3 will feature an extended, highly sophisticated set of metadata handling capabilities that will enable operators to achieve more tightly integrated media transfers between a broader range of broadcasting, post production and back office administrative systems.
By Ken Kerschbaumer, IBC Daily
Dolby Media Meter is a software tool for measuring loudness in programming for broadcast, packaged media, VOD, and games.
Differences in audio levels between programs and channels or between programs and commercials are a major annoyance to TV viewers. While obvious to the viewer, these loudness differences have proven difficult to measure with conventional methods and equipment.
With Dolby Media Meter, Dolby adds to its lineup of unique and innovative tools that accurately and objectively measure loudness as viewers subjectively experience it. Dolby Media Meter features Dialogue Intelligence technology, adopted from the award-winning Dolby DP600 Program Optimizer, that automatically detects and then measures loudness only during the presence of speech in an audio track. Dolby Media Meter measures loudness using the ITU-R BS.1770-1 algorithm, and measurements can be done either with or without Dialogue Intelligence.
Dolby Media Meter runs as a Mac or Windows stand-alone application, as a Digidesign Pro Tools AudioSuite or RTAS plug-in, and as a plug-in for Minnetonka AudioTools AWE. Dolby Media Meter supports measurement of Dolby Digital, Dolby Digital Plus, Dolby TrueHD, Dolby E, and PCM audio formats. All versions of Dolby Media Meter can produce and save log files.
Dolby Media Meter is a faster-than-real-time, file-based measurement application. This is useful for DVD and Blu-ray Disc mastering, and for program creation and quality control applications in audio production, postproduction, and broadcast facilities.
As a Pro Tools RTAS plug-in, Dolby Media Meter measures loudness in real time, so that users can track levels during the mixing process to help meet network delivery requirements. The RTAS plug-in can simultaneously display short- and long-term loudness levels that are based on the level of dialogue in the mix.
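At its core, a BS.1770-style measurement reduces to mean-square energy expressed in LKFS. The sketch below shows only that core; the real ITU-R BS.1770-1 algorithm adds K-weighting filters, and Dolby's tool layers Dialogue Intelligence gating on top, both of which are omitted here.

```python
# Bare-bones loudness estimate in the spirit of ITU-R BS.1770-1:
# mean-square signal energy expressed in LKFS. K-weighting and any
# speech gating are deliberately omitted, so this is only a sketch.
import math

def loudness_lkfs(samples):
    """samples: iterable of floats in [-1.0, 1.0]; returns LKFS."""
    mean_square = sum(s * s for s in samples) / len(samples)
    return -0.691 + 10 * math.log10(mean_square)

# One second of a full-scale 997 Hz tone at 48 kHz: mean square is 0.5,
# giving about -3.7 LKFS before K-weighting is applied.
tone = [math.sin(2 * math.pi * 997 * n / 48000) for n in range(48000)]
print(f"{loudness_lkfs(tone):.1f} LKFS")
```

Measuring energy this way, rather than peak level, is what lets such meters track loudness as viewers actually perceive it across programs and commercials.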
Pricing: $795 (MSRP)
The good news: Experts say the enabling technologies are sufficiently mature to bring stereoscopic 3DTV to the home.
The bad news: There are at least five content-encoding formats and more than 20 display technologies contending for the lead role.
"They all have their strengths and tradeoffs," said Chris Chinnock, president of Insight Media, which tracks the various display approaches. "The goal is to find something that works with the existing 2-D infrastructure" of production, distribution and display systems, he added. "Some are better for broadcast and others are better for TVs and set-tops," said Chinnock. "How to sort through them all is a little unclear to everyone at this point."
The Society of Motion Picture and Television Engineers has started a process to explore the content-encoding and format approaches. There are five such approaches in play, according to Wendy Aylsworth, vice president of engineering for SMPTE.
Philips' approach, called "auto-stereoscopic 3-D," involves adding metadata to 2-D content, and is the only one of what experts consider the leading methods that includes an option that doesn't require users to wear special glasses.
Other approaches are promoted by less well known companies--DDD Group plc, Real D Cinema, TDVision Systems and Sensio Technologies. All five planned to demo their technologies at an Aug. 19 SMPTE meeting on 3DTV. Dolby Labs and Texas Instruments also planned demos there.
Real D is getting traction in theaters, with deals to serve as many as 5,000 screens in the pipeline, but has yet to announce its direction for home products.
TDVision says its approach can be handled without any new hardware, is compatible with the MPEG infrastructure and can create a single content file that can serve both 2-D and 3-D uses.
Sensio's spatial compression algorithm requires hardware to handle decoding or real-time encoding. However, the company claims the hardware, now in a Xilinx Spartan III FPGA, will not require many transistors or much memory when it is integrated as a block into existing system chips. It showed a prototype 3DTV from SpectronIQ at the Consumer Electronics Show in January, has deals to be used in 50 movie theaters and has 16 movies encoded in its format.
Insight Media has categorized the many display technologies as a tree with three major branches. One branch includes approaches that separate the pixels of the images seen by the left and right eyes in time, another separates the pixels in space and a third places pixels in the same time and space. Most approaches require viewers to wear either active or passive glasses (passive glasses need no electronics or mechanical shutters), but a handful--including approaches demoed by LG, NEC, Philips, Samsung and Toshiba--require no glasses.
The so-called auto-stereoscopic (no-glasses) approaches typically split a large screen HDTV into as many as 20 zones to serve viewers sitting at different angles or distances from the display. The result is a high-def image sliced into multiple VGA-class images.
"They have to trade off a lot of image quality--it's pretty bad," said Chinnock. Long-term glasses-free approaches are a Holy Grail for the industry, but in the short term "it won't work for home movies, but it might be OK for ads or logos," he added.
3DTV will be a niche in the home if it requires special glasses, predicted Eric Kim, general manager of Intel's digital home group. Still, Intel has partnered with Dreamworks on tools that could enhance 3-D cinema and claims it will drive the technology into home and mobile products eventually.
There's still room for improvement. Experts know stereo 3-D can cause viewer eye strain and headaches if the content does not properly handle aspects such as brightness and the convergence of left and right images. However, the industry currently lacks metrics for stereoscopic image quality.
3DTV will force the need for new techniques--and maybe standards--in blocking, scripting, shooting and editing movies and live-action content, said Chinnock. It's a new dimension on many levels.
By Rick Merritt, EE Times
A stable version of the dirac-research codebase, Dirac 1.0.0, has been released. The release tar-ball can be downloaded from here or here.
Changes in this release are:
- Compliance with Dirac Bytestream specification 2.2.2.
- Adaptive GOP structure.
- Improved motion estimation.
- Improved pre-filtering.
- Major code refactor of encoder classes.
- Added conversion utility for horizontal 3/4 filtering.
- DirectShow Filter released, enabling playback of Dirac v2.2.2 raw bytestreams and Dirac wrapped in AVI in Windows Media Player and Media Player Classic.
Monday, September 22, 2008
Labels: IT Broadcast
Fujifilm Corporation has announced a radical departure from current imaging systems with the development of a completely new, real image system (3D digital camera, 3D digital photo frame, 3D print) that marks a complete break from previous attempts to introduce this technology.
Previous 3D systems were hampered by poor image quality and a cumbersome user experience, which often meant the need for special 3D glasses. One major benefit of the FinePix Real 3D System is that for digital camera LCD playback, display and print, the consumer can enjoy the image just as it was originally seen with the naked eye.
The same research team is determined to use these key technologies to open up a new market with 3D imaging. The new 3D image system features advanced image signal processing and micro-component technologies, and is so far able to demonstrate a camera, a viewing panel and a 3D printing system.
The technology behind the 3D camera
The 3D camera depends heavily on a newly developed chip called the “Real Photo Processor 3D” which synchronizes the data passed to it by both sensors, and instantaneously blends the information into a single high quality image, for both stills and movies.
“Built-in 3D auto” determines optimal shooting conditions from both sensors. 3D auto means that as soon as the shutter is depressed, key metrics for the image, such as focus, zoom range, exposure, etc, are synchronized. The camera is also fitted with built-in synchro control, giving 0.001-second precision for shutter control and movie synchronization.
The processor uses the same latest high-sensitivity, high-resolution technologies as the newest 2D processors. Identical high-quality compact Fujinon lenses have been specially developed for the 3D system to ensure complete conformity between the left and right images.
The LCD monitor system has also been completely revised. The camera is fitted with a 2.8-inch, 230,000-pixel LCD. Thanks to a new engineering approach, screen flickering and image deterioration, previously thought difficult to overcome, are reduced to an absolute minimum to achieve beautiful, natural 3D images. The screen also resolves 2D images like any other camera LCD.
Viewing with the FinePix Real 3D System
A new 8.4-inch, “FinePix Real 3D Photo Frame” with over 920,000 pixels has also been developed. The LCD monitor on the camera and the stand alone display panel share similar technologies which solve the problem of screen flickering and image ghosting, common problems with earlier developments, giving crisp, high resolution viewing of images in glorious 3D or standard 2D.
A newly developed “light direction control module” in the back of the LCD controls light to right eye and left eye direction. This light direction control system enables easy and high quality 3D viewing without special 3D glasses.
Printing with the FinePix Real 3D System
Using know-how gained through years of development of Frontier, Fujifilm has developed a 3D printing system using a fine pitch lenticular sheet giving high-precision, and fine quality multiple viewpoint 3D like never before.
Shooting with FinePix Real 3D System (future possibilities)
FinePix Real 3D System is also paving the way for new possibilities in 2D photo enjoyment. The heart of the system is a new concept camera fitted with dual lenses. Each lens can capture stills or movies from a slightly different position, producing the basis of the 3D image. By combining the two lenses, new functions can be achieved, for example an image quality improvement function (Simultaneous Dual-Image Shooting: Multi-Expression). For users, this is just one possibility from a dual-lens camera.
Other fascinating possibilities are also being explored.
The IBC master class "A production language for 3D" highlighted numerous tutorial elements, but at its heart was producer Phil Streather's assertion: "3D is an art, not an absolute science." Pointing out that there have only been nine digital 3D movies released thus far, he cited the increasing revenues as the sign that 3D is a fast track market.
Streather fronted the session, working to a script involving Pablo Post owner Ralston Humble and independent filmmaker Celine Tricart, and summarised what he had wanted to achieve: "In order to make comfortable 3D when filming closer than eight feet, you need to use a mirror rig.
"It is important to look at what 3ality is up to for the way it is selling its IP, and the way the rig companies in general are working on the scalability of rigs," he added. "And the real things that cause fatigue are mis-alignment and if the backgrounds are too far apart."
Quoting RealD's technical guru Lenny Lipton he said: "Good 3D is not just about setting a good background. You need to pay good attention to the seven monocular cues - aerial perspective, inter position, light and shade, relative size, texture gradients, perspective and motion parallax. Artists have used the first five of those cues for centuries.
"The final stage is depth balancing," he added. "But once you have done that you may end up with objects breaking the frame." Along with dozens of others, this problem was resolved, in this case by the use of an opaque mask.
Asked what he saw as important, Humble said: "With all the toolsets for post and the new rigs for sale, the cost of entry into 3D has been reduced. At the moment it is a collaborative movement with everyone working together in order to go forward, but it is up to the big post houses now. If they make it expensive or a pain to finish, no-one is going to do it."
By George Jarrett, IBC Daily
SMPTE President Robert Kisor has emphasised a need for a standard distribution file format for 3D content viewed in the home. This is a prominent issue on the SMPTE agenda. During the summer, the standards-setting body established a 3D Home Display Formats Task Force to define the parameters of a stereoscopic mastering standard that will enable 3D feature films and other programming to be played on all fixed devices in the home, no matter the delivery channel.
"The goal is to get to a single distribution master, just like we are trying to finalise in digital cinema," said Kisor, VP, engineering and technical services at Paramount Pictures, who likened the situation to the former Blu-ray/HD DVD issue. "Do you do one? Do you do both? Now look at it in 3D, where there are maybe four different formats. That fragments the market even more ... You don't want to have the consumer trying to sort out different copies of something to figure out which one plays on their particular combination of hardware.
"That's why it is important to get CEA to buy into the process," Kisor said. "If the technology side for file formats and interfaces can blend with the CEA development of how the systems work, then there is a good chance a consumer-friendly system can be generated through the task force."
Source: IBC Daily
The ARRIQCP (ARRI Quality Control Player) is a DCP (Digital Cinema Package) QC server designed for use in Digital Cinema content production workflows, offering quality assurance as well as analysis of the Digital Cinema mastering process. It is designed for facilities creating Digital Cinema content and gives them assurance that their DCPs are compliant. Besides playback functionality, the ARRIQCP performs an overall check of encoding, packaging and DCP validation, on top of file-integrity checks and content-analysing tasks.
- Play compositions and/or single track files with precise transport control.
- Remote control via TCP/IP and RS-422.
- Test JPEG Interop and SMPTE DCP + KDM for standards compliance.
- View detailed DCP metadata.
- Edit CPL Entry Point, duration values and CPL markers.
- Unencrypted HD-SDI/DVI-D interface with visible/invisible watermark for convenient machine room monitoring.
- Immediate playback without ingest for "emergency screening".
- Capture and export timeline locators with reviewer notes.
- Job Management for long-running analysis tasks.
Monday, September 22, 2008
Codex Digital has announced support for Arri's new Arriraw T-Link raw data recording system, as offered on its latest D-21 digital film camera.
It means that Codex Digital's data-recording (uncompressed or JPEG2000) will enable users to record the highest-possible resolution and dynamic range of the D-21 and future Arri cameras, and to deliver the captured material directly into the post production chain. The material is recorded complete with all available metadata, and delivered in any format required for effects and finishing.
The raw, unprocessed data from Arri's 4:3 sensor makes maximum use of its 2880 x 2160 resolution (around 3K) and 12-bit depth, and the Codex can offer realtime playback, including unsqueezed anamorphic material. This gives greater picture resolution than when using the 1920 x 1080 HD video output, as the entire image data captured by the sensor is retained. Codex can also de-mosaic and downsample the Arri cameras' Bayer-pattern images in realtime. Material captured with the D-20 or D-21 can be viewed and passed directly to editorial in the form of MXF, QuickTime, AVI or DPX files, without the need for lengthy de-Bayering and downconversion. In Arri data mode, Codex also supports all framerates, including full vari-speed support.
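A rough back-of-the-envelope estimate shows why retaining the full sensor data is demanding on the recorder. This sketch uses only the figures quoted above (2880 x 2160, 12-bit Bayer data); the 24 fps frame rate is an illustrative assumption, not from the article.

```python
# Rough ARRIRAW data-rate estimate from the D-21 figures quoted above.
width, height = 2880, 2160
bits_per_photosite = 12          # one 12-bit Bayer sample per photosite
fps = 24                         # assumed cinema frame rate (not from text)

bits_per_frame = width * height * bits_per_photosite
mb_per_frame = bits_per_frame / 8 / 1e6      # megabytes per frame
mb_per_second = mb_per_frame * fps

print(f"{mb_per_frame:.1f} MB/frame, {mb_per_second:.0f} MB/s at {fps} fps")
# → 9.3 MB/frame, 224 MB/s at 24 fps
```

At over 200 MB/s sustained, it is clear why uncompressed or JPEG2000 data recording hardware is needed rather than a conventional tape deck.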
by David Fox, IBC Daily
Even before 3D screenings have opened, digital cinema faces competition from viewers at home with their plasma displays. Bill Foster, senior technology consultant, Futuresource Consulting, says that while all the focus has been on the cinema, studios need to develop revenues from other sources.
"The home video, pay TV and even free-to-air revenue is even greater than that from the box office," says Foster. "If you take a multimillion dollar film, such as the forthcoming Avatar, if you're limited to all the 3D costs coming out of the cinema, and with the proportion of 3D even within digital cinema, Hollywood needs the extra revenues."
"We need to have a standardised method of packaging and distributing 3D content," says Foster. "We need a standardised interface so that if HDMI is going to carry a stereoscopic image, we need to know how it will be carried and how an ordinary TV is going to know what to do with it."
Foster says the solution could be to ensure that HDMI has an additional flag to signal the fact it is a stereoscopic image. "If you've got one HDMI input on your TV you can pull out your Sky box and put in a Blu-ray player and it will work. The industry needs a standardised interchange. It doesn't mean that everybody is going to shoot 3D in the same way, and it doesn't mean everybody is going to display 3D in the same way, but the bits in between have got to be standardised."
By Julian Clover, IBC Daily
Ateme has launched its new Kyrion Tape to File system, the KTF 7100. Designed for the post market, it shares the same high quality H.264 encoding engine as used by the Kyrion File Encoder. The KTF 7100 works with all professional video tape decks and can be used for providing Dailies and VoD content compatible with set-top boxes.
Fitting in a single 1RU rack module, it includes embedded storage (160 GB), 2 Ethernet ports, 1 USB port and a SDI video/audio connection, and is ideal for anyone looking to quickly ingest video tapes and Blu-ray discs into a file-based system. The KTF 7100 supports SD or HD video from QCIF to 1080p in PAL and NTSC. It features real-time quality monitoring and a built-in web based configuration tool. Add in the Kyrion File Encoder (KFE 2.0), and you have what the company bills as a complete solution for building a file-based workflow.
The company is also releasing its third generation MPEG-4 AVC/H.264 encoding engine, which now features even greater bandwidth efficiency - up to 30% gain - when compared to the previous generation while providing the very same level of picture quality. It also includes support for the 4:2:2 profile, higher bitrates and ultra low latency at below 500 milliseconds.
By Andy Stout, IBC Daily
Monday, September 22, 2008
Labels: IT Broadcast
Ascent Media Viia's suite of file-based media services enables content to be digitised, stored, re-purposed and distributed globally in multiple formats and languages. Charter clients include Sony Pictures Entertainment, Paramount Pictures and BBC Worldwide. To date, Viia has processed more than 90,000 assets, delivered file-based titles to more than 50 broadcast, broadband, and mobile distributors worldwide, and archived more than 9,000 titles.
Developed by Ascent Media, and delivered through a strategic partnership with HP, Viia is a state-of-the-art suite of technologies and services used by studios, content providers, networks and aggregators. It combines managed content libraries with an advanced repurposing engine and global distribution capabilities to get your content to any channel you want to pursue on an efficient and cost-effective basis.
Viia provides complete services to ingest and encode new and existing content including full finished titles, production elements, and metadata, into a central repository you can access and manage from anywhere in the world.
When opportunity calls, your content will answer. Our industry-leading talent uses the best platform of technologies and services to transform and package your assets on-demand to meet any set of requirements and specifications—for every device, every channel, every market.
Don’t keep your customers waiting. From same-day file delivery anywhere in the world to real-time satellite and fiber transmission, we'll get your content where it needs to be - safely, reliably, and cost-effectively.
- Library management
- Content versioning
- Standards Conversion
- Foreign Language Audio Layback
- Digital rights management (DRM)
- Encryption and watermarking
- Metadata management
- Electronic fulfillment
- Performance tracking
- Global File Transport
- Real and non-real time distribution
- Content store-and-forward
- Audio and video decoding to physical media
- SD and HD fiber transmissions
- Satellite transmission and space segment booking
Cine-tal Systems is displaying its line of display, image processing and colour management products. The collection includes Davio, a portable HD video and DI image processing system; and the latest versions of Cinemage monitors and eL 1000 image processors.
Additionally, Cine-tal is showcasing cineSpace colour management technology that it acquired recently from Rising Sun Research. The company will demonstrate how it has adapted cineSpace to support display profiling and LUT generation matching with its display and image processing systems.
Cinemage display systems are designed for digital cinema acquisition, post production and DI monitoring. They combine the company's Intelligent Display Server (IDS) with a calibrated, full-resolution LCD display. Cinemage provides video analysis, color pre-visualisation, video signal quality assurance, and an integrated Omnitek dual link waveform monitor and vectorscope.
Davio is a smaller, lower-cost alternative to the company's eL 1000 processor. The device is configurable for production, post production and broadcast workflows. Multiple Davios can be combined to handle complex tasks or to keep pace with growing demands.
The Davio software library provides tools for:
- Display calibration & emulation
- Color pipeline management
- 3D stereo processing
- Still Store
- Frame markers, graticules and cages
The eL 1000 provides an open architecture color processing system that can calibrate output devices, process color space conversions and manipulate color for purposes of workflow calibration, color gamut mapping and color pre-visualization.
cineSpace is designed for production and post production facilities to ensure that all of their displays match and conform to a specified output medium, such as a particular film stock or video format.
Dolby 3D Color Processor
The Dolby 3D Color Processor, manufactured by Cine-tal, performs accurate left eye / right eye color balancing for color grading suites and screening rooms using the Dolby 3D Cinema Color Filter for digital projection. It provides the same color processing as Dolby's Digital Cinema Show Player used in the exhibition theater. With the Dolby 3D Color Processor, production and post production professionals are guaranteed the same color accuracy when 3D material is played back from any source or server and projected with a Dolby 3D Cinema Color Filter.
By Carolyn Giardina, IBC Daily
The new CineMonitorHD6 3DView location monitor from Transvideo is designed to allow 3D viewing of feeds from two genlocked HD-SDI cameras on location. The six-inch monitor is based on Transvideo's SuperBright 1000 Nits monitor, with an added processing system to correlate the two HD-SDI feeds. An anaglyph mode (using two different colours) gives the operator a realistic preview of the 3D picture.
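The anaglyph preview described above encodes the two eye views into complementary colour channels. The article does not specify the colour pair, so this sketch assumes the common red/cyan scheme, with pixels as plain (r, g, b) tuples; real monitor hardware does the equivalent per video sample in realtime.

```python
# Minimal sketch of red/cyan anaglyph compositing, the likely principle
# behind the monitor's two-colour anaglyph preview mode (assumption).

def anaglyph_pixel(left_rgb, right_rgb):
    # Red channel from the left-eye feed, green and blue from the right.
    return (left_rgb[0], right_rgb[1], right_rgb[2])

def anaglyph_frame(left, right):
    # Combine two same-sized frames (lists of pixels) into one preview frame.
    return [anaglyph_pixel(l, r) for l, r in zip(left, right)]

left_frame = [(200, 10, 10), (50, 60, 70)]
right_frame = [(190, 20, 30), (55, 65, 75)]
print(anaglyph_frame(left_frame, right_frame))
# → [(200, 20, 30), (50, 65, 75)]
```

Viewed through red/cyan glasses, each eye then sees only its own camera's feed, giving the operator a rough but immediate depth preview on location.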
Source: IBC Daily
Ikegami has a prototype flat panel master monitor on show that it believes will offer the picture quality of a Grade 1 CRT, without its disadvantages. The field emission display incorporates real phosphors, "so black is black and colourimetry is correct, just like a CRT, but unlike the CRT it doesn't have geometry issues or edges going out of focus," explained Mark Capstick, general manager, Ikegami Electronics UK.
The FED is aimed at top-end, high quality monitoring as a replacement for Grade 1 monitors, and meets the requirements of the EBU and the Association of Radio Industries and Businesses. It uses more than 10,000 nanocone emitters to illuminate each pixel and promises to be low on power consumption and relatively thin. The new monitor is not expected to be available until late next year, when it is likely to be between 24 and 26 inches in size. Ikegami needs to establish a suitable manufacturing facility in the interim.
Source: IBC Daily
Sunday, September 21, 2008
Labels: Displays and Projection
Nvidia wants to reinvigorate the 3D stereoscopic market by developing its own glasses hardware and driver software, which it hopes will avoid the pitfalls of previous efforts. Maximum PC talked to Andrew Fear, the product manager of GeForce Stereoscopic 3D, to get the full scoop on why this isn’t going to be just another fad.
How would you summarize stereoscopic 3D for someone who's never used it?
NVIDIA GeForce stereoscopic 3D technology is an NVIDIA software and hardware solution which takes standard Microsoft DirectX games and converts them to stereoscopic 3D for an incredibly immersive gaming experience. Now all of your games have depth information that goes into and comes out of your monitor. One of the best things about this from a gamer’s standpoint is that we are using the standard 3D games they are playing -- we are not requiring special versions of games to get this experience.
How does it work?
The NVIDIA GeForce Stereoscopic 3D driver works at the lowest level by taking 3D game data and rendering each scene twice – once for the left eye and once for the right eye. The two eye images are offset from each other for correct viewing. The GPU then sends this data to a 3D Ready display. These displays show the left-eye view for even frames (0, 2, 4, etc) and the right-eye view for odd frames (1, 3, 5, etc). NVIDIA 3D glasses then synchronize back to the 3D Ready display and present slightly different images to each eye, resulting in the illusion of depth and an incredibly immersive experience for games.
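The even/odd frame-sequential scheme described in the answer above can be sketched as follows. The `render_view` function and the separation value are illustrative placeholders, not NVIDIA's actual driver API.

```python
# Sketch of frame-sequential (page-flipped) stereo: each scene is rendered
# twice with a horizontal eye offset; even frames carry the left-eye view,
# odd frames the right-eye view. All names here are hypothetical.

EYE_SEPARATION = 0.065  # metres; a typical interocular distance (assumption)

def render_view(scene, camera_x):
    # Stand-in for a real renderer: return a label describing the view.
    return f"{scene} rendered from x={camera_x:+.4f}"

def frame_for(scene, frame_number):
    # Even frames -> left eye, odd frames -> right eye.
    if frame_number % 2 == 0:
        return render_view(scene, -EYE_SEPARATION / 2)   # left eye
    return render_view(scene, +EYE_SEPARATION / 2)       # right eye

# The shutter glasses, synchronized to the display, ensure each eye sees
# only its own half of this alternating left/right sequence.
sequence = [frame_for("scene", n) for n in range(4)]
```

Because every displayed frame carries only one eye's view, each eye effectively sees half the panel's refresh rate, which is why the 120 Hz displays discussed later matter.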
What software and hardware is needed?
You’ll need a PC with the following:
- An NVIDIA GeForce 8800 GT GPU or better.
- Windows Vista 32-bit (64-bit support coming soon).
- A standard Microsoft DirectX game that NVIDIA has preconfigured in our driver (to date NVIDIA has preconfigured more than 350 games).
- A supported 3D Ready display. To date we have announced support for ViewSonic pure 120 Hz LCDs and Mitsubishi DLP HDTVs.
- NVIDIA stereoscopic 3D active shutter glasses (coming soon).
How does the current generation of stereoscopic 3D tech differ from what gamers saw five years ago?
The new software technology we are working on has come a long way. Today our driver supports NVIDIA SLI, GeForce 8 series, Windows Vista, and DirectX 10. So it’s a cutting edge, terrific gaming platform to start with.
Our driver now supports the latest Zalman Trimon 3D Ready displays and will add support for new 3D Ready displays (ViewSonic and Mitsubishi) working with our new 3D glasses later this year. The underlying technology works the same, but the experience has improved with support for more games, more graphics cards, and new hardware.
How does game integration work? Will patches or special game profiles be required? Is it compatible with both Direct3D and OpenGL?
NVIDIA GeForce stereoscopic 3D technology was designed to work with virtually all DirectX 7, 8, 9, and 10 games. The driver automatically converts standard 3D games to work with 3D Ready displays. There is no need for patches. In fact, more than 350 games work well with our technology out of the box. NVIDIA is also working with game developers to ensure that new titles work properly with our stereoscopic 3D technology out of the box. Right now, we do not have OpenGL support but will be working to release it soon.
We saw demos of the technology running in a real-time strategy, shooter, and racing game. How does the technology know how to differentiate between game genres to ensure that 3D looks right?
NVIDIA’s software team analyzes games and correctly configures the settings based upon the type of game you are using. So the great thing for consumers is that we’ve done all the work for you, so you can get gaming in minutes.
How will users be able to calibrate 3D?
One of the biggest limiting factors in previous solutions for gamers was that they required meticulous calibration when setting up your display and glasses. With many advances in technology, a lot of that setup can be done automatically now since we can detect the displays, glasses, and games. That being said, end users still have full control over the amount of 3D depth (sometimes called eye separation) for all of their games, and can configure these settings directly in a software control panel. In our new solution launching later this year, we will also provide a scroll wheel on the back of the wireless emitter that lets you quickly “dial in” the level of 3D depth to your taste.
Will this work with someone who wears glasses or contact lenses?
Users who wear glasses or contact lenses should have no problems with our 3D glasses. In fact, our glasses were designed from day one to be easily worn over most types of glasses frames, so you can comfortably wear both. In addition, we will provide different nose piece attachments when the glasses ship so you can select the nose piece that’s most comfortable for you. We tested our design among scores of eyeglass users leading up to and including NVISION, and no glasses wearer had any trouble wearing our 3D glasses over their prescription ones. Contact lens users won’t be affected and can wear our 3D glasses with no problems.
Is the effect nauseating after prolonged usage?
The experience of playing a game in 3D can be so convincing that those new to it may feel slightly disoriented at first. This varies considerably, because everyone is different. Some people get car sickness and others don’t. It’s the same thing with 3D – some people can feel disoriented while others aren’t affected. Typically, people have a negative experience with stereoscopic 3D gaming for two reasons: a low refresh rate and too much 3D depth.
Our new 3D glasses solve the problem of low refresh rate because they are designed to work with LCDs and DLP HDTVs which operate at a higher refresh rate. Most gamers are extremely comfortable at these settings.
Too much 3D depth can also cause eyestrain, since your brain needs time to adjust to dimensionalized data on your monitor. If you think about it, all of your life your brain has been trained that it only has to focus at the depth of your monitor, even when you are playing 3D games. In the real world, however, your brain is trained to change its focus between objects at different depths all of the time, and you do not experience any problems.
So if you think about it, we are just retraining your brain to now be able to focus on your monitor knowing that objects go into and come out of the screen. To help ease this transition for users, our software always starts off with a lower depth amount. We tested this level with end users and found it was a good value for people experiencing stereoscopic 3D for the first time. We also found that most people’s eyes adjust fairly quickly after about fifteen minutes and generally want to turn up the 3D depth after that.
We’ve done extensive testing with our new glasses and 3D Ready displays, and we’ve found that experienced users can easily play a game for 4 hours or more without feeling eyestrain or disorientation.
What are the technological limitations of stereoscopic tech? Will more than one person be able to see the game in 3D at once?
Absolutely, that’s one of the things that’s so cool about it. If you’ve gone to a 3D movie recently, you’ve seen how the audience reacts when characters and objects appear to jump out of the screen. You can also enjoy the same sense of amazement playing games with your friends or family. It definitely makes games more interesting to watch. At NVISION, we demonstrated stereoscopic 3D gaming on Mitsubishi DLP HDTVs and we had more than eight people using our 3D glasses at once, all watching the same game. Our 3D glasses use a wireless IR receiver to synchronize back to the monitor and PC, so the number of users that can game at once is literally how many people you can fit in your living room!
What kind of GPU processing power is required to render stereoscopic images? How are framerates affected?
We recommend a GeForce 8800 GT-level GPU or faster for a good stereoscopic 3D experience because our 3D technology must calculate two versions of each frame to render it correctly. For this reason, there will be some performance impact when running a game in stereoscopic 3D mode. With a suitable GPU, the gameplay experience is still fast and immersive.
Can you talk about the shutter glasses hardware that NVIDIA is working on and planning to bring to market? Release date and price range?
These glasses are a new design from NVIDIA: they operate wirelessly to an IR transmitter that connects to the back of your PC via USB. They have a rechargeable battery that lasts about 40 hours on a single charge, and they turn off after 10 minutes of non-use to save battery charge. A small indicator light will blink red when the battery needs to be recharged. Simply connect it to the (included) USB cable to recharge.
We expect to release the glasses in a package with the emitter by the end of this year. The retail price hasn’t been set yet.
Is NVIDIA working with any publishers or developers to promote 3D stereoscopic technology?
Absolutely. We have shown the glasses to the majority of PC game publishers and developers. They love the effect, and they like the fact that they don’t need to do anything special to support it. Most developers just say “When can I get one?” That being said, game developers can always work with us to ensure that a game is optimized out of the box and delivers an even more immersive experience.
Is this a technology that’s being targeted for the living room or more for desktop gaming?
That’s a good question. From our standpoint, we’ll feel we succeeded if users can have a great 3D experience at their PC or in the living room. It probably depends on the room and the monitor, because we’ve noticed that people like to be fairly close to the ViewSonic 22-inch desktop LCD, and they like to be about 8 feet away from the Mitsubishi 73-inch Diamond Vision DLP.
What are some other applications of 3D stereoscopic tech outside of games?
Simulations are an obvious area of great potential. At NVISION an engineer told us how he developed astronaut training simulations for NASA which cost hundreds of thousands of dollars and produced a similar effect. Good stereoscopic 3D technology can be used for training pilots, doctors, technicians, and soldiers.
One of the other areas we are looking at for consumers is the wide array of 3D applications now available, such as Google Earth, Piclens, and Microsoft Photosynth. All of these applications utilize the processing power of a GPU to render their effects in 3D. Since our GPU can access that data, we can create a stereoscopic view of it and completely immerse you in it.
Home movies are also moving towards 3D. Consumers are eager to enjoy high-fidelity, immersive experiences in their home after experiencing them in the theater. If they can have immersive experiences at an affordable price, you’ll see nothing but smiles under those 3D glasses. 3D movies for the home are not quite ready yet, but we are working with the industry to help enable a new standard for the home.
We also chatted with Duane Brozek of Viewsonic to get a panel-maker’s perspective of 3D Stereoscopic tech:
What technologies do display panels need to have for 3D to be supported?
There are currently several types of 3D technologies available in the market. The two most common are stereoscopic with active shutter glasses, and autostereoscopic, which needs no glasses. The autostereoscopic technologies include barrier-type, directional-BLU, and lenticular-type LCD panels.
ViewSonic feels that clearly the best available solution in terms of performance, manufacturing complexity, and cost is the stereoscopic technology which we recently announced in conjunction with NVIDIA. In terms of panel design, the only requirements are the ability to run in native mode with 120 Hz content input, and the ability to complete a gray-to-gray transition within one frame period at 120 Hz (this equates to a gray-to-gray response time below 8 ms).
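The 8 ms figure follows directly from the 120 Hz refresh requirement: a panel must finish its gray-to-gray transition within one frame period, or ghost images from the wrong eye would linger on screen. A quick check of the arithmetic:

```python
# One frame period at 120 Hz sets the upper bound on gray-to-gray
# response time for active-shutter stereoscopic viewing.
refresh_hz = 120
frame_period_ms = 1000 / refresh_hz   # milliseconds per frame

print(f"frame period = {frame_period_ms:.2f} ms")  # → frame period = 8.33 ms
```

Since the transition must complete within roughly 8.33 ms, panel makers quote the requirement as a response time below 8 ms.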
While autostereoscopic technologies are improving, they are panel structure dependent and have a high cost of manufacturing and software development. Additionally, they can demonstrate a number of limitations in terms of performance criteria such as brightness, resolution, and viewing position. We don’t believe that a good quality stereoscopic technology without glasses is cost effective either now or in the foreseeable future.
Are there additional benefits to 120 Hz LCD panels?
There are three additional benefits of 120 Hz LCD technology for consumers:
- Enables full resolution stereoscopic viewing with active shutter glasses technology.
- Enables a wider viewing angle than current autostereoscopic solutions with active shutter glasses.
- 120 Hz LCDs are also terrific for gamers when not playing stereoscopic 3D games, because the higher refresh rate means you can display more frames per second running on NVIDIA GeForce GPUs.
How much more expensive will 3D-supported panels cost over regular displays?
Depending on the type of 3D implementation chosen, the additional cost on the monitor side could range anywhere from $100 to well over a thousand dollars. ViewSonic is targeting to launch our first “pure” 120Hz / 3D desktop product at an end user price range reflecting a premium at the lower end of that scale. For the performance improvement we will deliver, we believe that gamers, graphics professionals and enthusiasts will be excited to put one on their desktop. The Controller and Glasses will be sold separately.
What kind of market penetration and adoption rate do you expect for 3D-capable panels in the next couple years? What will be the biggest determining factor for consumers to get on board with this tech?
We believe that 3D-capable LCD monitors will certainly be one of the fastest growing segments over the next several years. We are essentially starting from a base of zero though, and do not expect to see market share greater than 5 percent within the period. However, the products that we introduce now will be laying the groundwork for the next generation of 3D displays, and providing a framework for the continued development of new 3D content. These new “pure 120Hz” monitors not only provide a crisp, blur-free 2D experience for a myriad of consumer and business applications, but also a truly immersive gaming experience that we believe will revolutionize the desktop and generate considerable demand and sales.
By Norman Chan, Maximum PC