Choice in 3-D Digital Cinema

"Digital 3-D is the buzz today in cinema. Innovation in both content production and presentation has elevated stereoscopic 3-D to a quality of audience experience never before possible. On the content production side, live stereoscopic capture and editing techniques make possible productions like the upcoming U2 3-D movie. Improvements in computer graphics rendering, such as those developed by Sony ImageWorks, have led to the visually stunning 3-D imagery seen in Beowulf. Conversion of live 2-D productions into 3-D, perhaps best known through In-Three's demonstration clips of the Star Wars series, will make possible the re-release of popular 2-D blockbuster movies in 3-D.

It is advances in the presentation of stereoscopic images through digital projection, however, that have made the resurgence of 3-D possible. Texas Instruments' DLP Cinema technology makes it possible to project stereoscopic 3-D images with a single projector, and at a quality level not possible with 35mm film. In contrast, LCOS projectors, such as the Sony SRX-R220/R210 cinema projectors now in trial installations, require two projectors to present 3-D images. By outfitting either a DLP or LCOS projector with a 3-D presentation kit and glasses (available from a few companies), the exhibitor has the opportunity to present 3-D movies worthy of a premium charge to audiences.

All digital 3-D content is distributed in a 48 frame-per-second (fps) format. In the stereoscopic format, images for the left eye are distributed at 24 fps, and likewise 24 fps for the right-eye images, the sum of the two rates equaling 48 fps. However, other factors affect the single-inventory nature of 3-D distributions, leading to disparate distribution methods. To overcome this, Digital Cinema Initiatives (DCI), a coalition of the six major motion picture studios, announced in April its draft specification for 3-D content distribution, stating that all 3-D presentation methods must utilize a common distribution format. While this ideal has not yet been realized in existing 3-D systems, the goal is technologically feasible. To accelerate progress, the Society of Motion Picture and Television Engineers (SMPTE) is now standardizing a single 3-D distribution format. Just as importantly, the various providers of presentation systems either already offer, or will in the not-too-distant future offer, systems that support single-inventory 3-D content.

DLP technology can project 3-D images with a single projector by presenting the stereoscopic left/right image pairs sequentially. A left image is presented, then a right image, and never do both a left and a right image appear on the screen at the same time. However, presenting left/right images to the audience at a 48 fps rate is less than ideal, as the sequential nature of the images is perceivable and distracting. To overcome this, sequential projection requires that the stereoscopic pair of images be "flashed" on screen: within the time frame of 1/24th of a second, the left/right sequence is repeated three times before the next left/right sequence is presented. This process is called "triple flash." With triple flash, the rate at which images are presented to the audience is a speedy 3 x 48 fps, or 144 fps. The triple flash rate is a property of the projector, and is the flash rate employed with all add-on technologies for presenting 3-D images in the theatre.
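The triple-flash arithmetic above can be sketched as a small timing model (an illustrative sketch, not projector code; the constants mirror the figures quoted in the article):

```python
# Sketch of the "triple flash" timing described above (illustrative only).
# Each 1/24 s film frame period shows the left/right pair three times.
FILM_FPS = 24          # distribution rate per eye
FLASH_COUNT = 3        # repetitions of each left/right pair
EYES = 2

frame_period = 1.0 / FILM_FPS                 # seconds per stereo frame pair
flash_rate = FILM_FPS * FLASH_COUNT * EYES    # images shown per second

def flash_schedule(frame_index):
    """Return (time, eye) flash events for one 1/24 s stereo frame."""
    t0 = frame_index * frame_period
    slot = frame_period / (FLASH_COUNT * EYES)  # duration of one flash
    events = []
    for i in range(FLASH_COUNT * EYES):
        eye = "L" if i % 2 == 0 else "R"
        events.append((round(t0 + i * slot, 6), eye))
    return events

print(flash_rate)          # 144
print(flash_schedule(0))   # alternating L/R flashes within the first 1/24 s
```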

As pointed out, several add-on kits and glasses for 3-D presentation are now available. These can be categorized by technique: polarization, spectral division, and shutter glasses. While all three techniques can be used with DLP Cinema projectors, only polarization and spectral division work with dual-projection systems. Where the 3-D add-on technologies differ is in the method employed to direct left images to left eyes and right image to right eyes.

Polarization is the most widely used technique today. It involves optically encoding each left image with one direction of light polarization, and each right image with the opposite direction. In the Real D system, the encoding takes place at the projector using an electronically controlled polarizer, which Real D calls the "Zscreen." Images are decoded when the audience wears complementary polarized glasses. To allow head movement without upsetting the decoding quality of the glasses, Real D uses only circularly polarized filters in its system. Polarization alone, however, does not offer sufficient rejection of crosstalk (quantified by the extinction ratio) between stereoscopic images. The audience experiences such crosstalk as a ghost in the motion picture. To enhance the ability of its polarization method to reject ghosting, Real D employs a "ghost busting" technique, which requires pre-processing of the images prior to projection. In its early systems, Real D's ghost-busting is applied prior to distribution. In future systems, ghost-busting will be applied in real time by means of a processing box in the playback system.

Spectral division technology optically encodes left and right images by projecting each with a differently filtered spectrum of light. In the Infitec spectral division technique licensed by Dolby Laboratories, the light is filtered such that both the left spectrum and the right spectrum appear as white (or near-white) light. In this way, the technique is importantly differentiated from the older, much lower quality, anaglyph method of using red filters for one eye and blue filters for the other. In Dolby's implementation, the light path in the projector is modified with a filter wheel to achieve spectral division of the stereoscopic images. Prior to projection, some color-balancing is applied to the image signal inside Dolby's digital cinema server. Audience members wear complementary spectral division glasses that decode the images so that left-eye images are seen only by the left eye, and right-eye images only by the right eye. To accomplish this, Dolby's glasses employ some 50 layers of thin-film coatings to create the appropriate optical interference filters. Because interference filters require light to pass through at a 90-degree angle, the glasses are curved to allow for eye movement without loss of decoding quality for the viewer.

Shutter glasses, promoted for cinema use by Nuvision, take direct advantage of the sequential nature of the projected images. No special optical encoding is required with shutter glasses. To "decode" the sequential images, the audience wears glasses that allow only one eye to see the screen at any one time. By synchronizing the shuttering of the glasses with the flash rate of the projector, the audience correctly sees only left images in the left eye, and right images in the right eye. Synchronization of the glasses takes place through infrared transmission inside the auditorium. To achieve the shutter action, the glasses must contain battery-powered electronic circuitry that drives the liquid-crystal lenses.

The three methods described have important points of comparison. To preserve the polarized nature of the projected light in the auditorium, the polarization method requires the use of a silver screen. In contrast, the other two methods, spectral division and shutter glasses, work well with a normal matte white projection screen. Polarization, however, allows the use of very-low-cost glasses, inexpensive enough to be given away to audience members. Both spectral division and the shutter method require expensive glasses that must be recycled (and thus regularly washed) for the method to be economical.

Manipulating projected light for the presentation of 3-D images has its price: all methods severely reduce the amount of light that reaches the eyes of the audience, typically by around 85-88%. To the exhibitor, the light level determines the maximum screen size on which an acceptable 3-D image can be presented. Fortunately, it is acceptable to project 3-D images at significantly lower light levels than 2-D images, typically around 4 foot-lamberts (ft-L), versus the standard 14 ft-L for 2-D. To compensate for low light levels, a high-gain screen can be employed. Silver screens, of course, are very high in gain. It's understandable, then, that the polarization method, which requires the use of silver screens, adapts well to large-screen applications, as indicated by IMAX's choice of polarization for its 3-D presentation systems.
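The light-level arithmetic above can be sketched as follows (the 14 ft-L baseline, the 4 ft-L target, and the 85-88% loss figures come from the article; the screen-gain ratio is an assumed, illustrative value):

```python
# Rough screen-luminance arithmetic for 3-D presentation (illustrative).
BASELINE_2D_FTL = 14.0   # standard 2-D screen luminance (ft-L)
TARGET_3D_FTL = 4.0      # typical acceptable 3-D luminance (ft-L)

def luminance_3d(baseline_ftl, light_loss, gain_ratio=1.0):
    """Luminance after the 3-D optical path, optionally boosted by a
    higher-gain screen (gain_ratio = new gain / reference gain)."""
    return baseline_ftl * (1.0 - light_loss) * gain_ratio

# With 85-88% loss, an unchanged setup falls well below the 4 ft-L target:
for loss in (0.85, 0.88):
    print(round(luminance_3d(BASELINE_2D_FTL, loss), 2))  # ~1.7-2.1 ft-L

# A hypothetical silver screen with ~2.4x the gain clears the target again:
print(round(luminance_3d(BASELINE_2D_FTL, 0.85, gain_ratio=2.4), 2))
```

This is why the article links silver screens (very high gain) to large-screen 3-D: gain is the only free variable once the optical losses are fixed.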

There is good reason for the recent resurgence in cinematic 3-D, spurred on by advances in both content production and digital cinema presentation. For exhibitors, excellent choices exist among 3-D add-on technologies for 2-D digital cinema systems, all delivering high-quality 3-D images to the audience. However, each 3-D add-on system presents a unique set of tradeoffs, clearly leaving the choice of system to exhibitor preference."

By Michael Karagosian, MKPE Consulting

Thomson Demonstrates 4K Real-Time Data Streaming of Uncompressed Digital Content over IP Network

"Thomson today successfully demonstrated the real-time streaming of an uncompressed digital 4K movie over a 10Gbps optical fibre IP network operated by LambdaNet Communications Deutschland AG. For the first time, a production-quality digital movie was transmitted over a looped 2,000 km IP network, interconnecting German cities Berlin, Dresden, Erfurt, Hannover, Leipzig and Magdeburg. Transport and playback in real-time took place at Thomson’s Corporate Research Center in Hannover.

LambdaNet provided “LambdaNet Data Link”, a business solution that allows customers to utilize simple and cost-efficient Ethernet technology for both LAN and WAN networking.

Thomson’s newly developed high-performance transport protocol allows a comprehensive utilization of the 10Gbps network bandwidth, eliminating adverse effects introduced by long distance connections. Outperforming other solutions in the marketplace, such as Transmission Control Protocol (TCP) and User Datagram Protocol (UDP), the new Thomson protocol is fully backwards compatible with existing IP network infrastructure.

Anticipating the needs of the increasingly globalized digital content production industry, Thomson has been conducting research for almost two years on high speed real-time data transfer technology to enable efficient management of production-quality digital content over long distances. Initial achievements on a simulated network were presented at NAB 2007. Now, by partnering with LambdaNet, Thomson was able, for the first time, to showcase the results of their research on a commercial European network."

Source: Yahoo

Editopia Releases Groundbreaking Encoding Solution

Editopia, an open source technology company for the online video market, today announces the release of NCode, a server-side video encoding solution that enables web developers to easily add video-to-Flash encoding to their websites. NCode will encode virtually any video format to FLV. It is open source and free to use, licensed under the GPL. An enterprise edition will soon be available, offering support, indemnification, and rights to fee-based codecs.

NCode supports multiple codecs, including H.264, is entirely cross-platform and scalable, and includes an image-processing pipeline, making it potentially the most powerful server-side encoding application available to web developers. NCode currently supports both PHP and Java programming environments.

Codec Support for Flash:
- H.263
- H.264
- On2 VP6

Video Formats Supported:
- 3GP
- MOV
- WMV: 7&8
- MP4
- MPEG1
- MPEG2: PAL & NTSC
- DV: PAL & NTSC
- AVI

Audio Codecs Supported:
- AAC
- MP3

Server Platforms:
- RedHat
- Fedora
- Ubuntu
- Windows
- Solaris

AMPAS: Archive Before It's Too Late

"How many of you have favorite movies or music on 8-track, VHS or Betamax tapes but no longer have a player for these formats? How many of the VHS tapes that contain precious family memories still will be accessible after another 20 or 30 years? How many of your digital photographs are a hard-drive crash away from being lost forever?

The Science and Technology Council of the Academy of Motion Picture Arts and Sciences asked similar questions about our film history as an increasing number of features are being photographed, posted and mastered in new and varying digital formats.

Having conducted an extensive study of the topic, the committee has released a 74-page report titled "The Digital Dilemma: Strategic Issues in Archiving and Accessing Digital Motion Picture Materials."

"We are already heading down this digital road ... and there is no long-term guaranteed access to what is being created," said Milt Shefter, who is the project leader on the AMPAS Science and Technology Council's digital motion picture archival project. "We need to understand what the consequences are and start planning now while we still have an analog backup system available." In fact, the council already has identified instances where digital content could not be accessed after only 18 months.

If this issue is not addressed, Shefter said: "From a studio standpoint, if you don't have guaranteed access, you lose the potential future revenue (from a library). From a culture standpoint, you lose the ability to look back at what was being done in a period of time."

Shefter noted that a requirement for any preservation system is that it must meet or exceed the performance characteristics of the current analog photochemical film system. According to the report, these benefits include a worldwide standard; guaranteed long-term access (100-year minimum) with no loss in quality; the ability to create duplicate masters to fulfill future (and unknown) distribution needs and opportunities; and immunity from escalating financial investment.

"There's nothing in the digital world that comes close to this at this point," he said. "The problem is uniform. Everybody has it across the business spectrum, and there is no solution."

The economics are also outlined in the report, which suggests that the annual cost of preserving film archival master material is $1,059 per title, while the cost of preserving a 4K digital master is $12,514. The report states, "The annual preservation costs for a complete set of digital motion picture source materials also are substantially higher than those for film, and all digital asset storage requires significant and perpetual spending to maintain accessibility."
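The per-title figures quoted above imply roughly a 12-to-1 cost gap, which compounds over the report's 100-year access horizon (a back-of-the-envelope sketch that ignores inflation and format-migration costs):

```python
# Comparing the report's annual per-title preservation costs (the two
# dollar figures are quoted in the article above; the rest is arithmetic).
FILM_COST_PER_TITLE = 1_059      # USD per year, film archival master
DIGITAL_COST_PER_TITLE = 12_514  # USD per year, 4K digital master

ratio = DIGITAL_COST_PER_TITLE / FILM_COST_PER_TITLE
print(round(ratio, 1))  # 11.8 -- digital costs ~12x film per title per year

def cumulative_cost(annual_cost, years):
    """Total spend over a horizon, ignoring inflation and migration spikes."""
    return annual_cost * years

# Over the report's 100-year access requirement the gap compounds:
print(cumulative_cost(FILM_COST_PER_TITLE, 100))     # 105900
print(cumulative_cost(DIGITAL_COST_PER_TITLE, 100))  # 1251400
```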

Said Andy Maltz, director of the Science and Technology Council: "This is one of the most important issues the Technology Council is looking at. It's defining a lot of our work for the next two to three years, or longer."

By Carolyn Giardina, The Hollywood Reporter

3-D Future Still Fuzzy

"Stakeholders in the 3-D arena believe the upcoming concert films "U2 3D" and Disney's 3-D "Hannah Montana/Miley Cyrus: Best of Both Worlds Concert Tour" signal that future content can be much more than theatrical features.

But all is not rosy in the 3-D universe: Some execs who attended a panel discussion about 3-D on Tuesday at Digital Hollywood at the Renaissance Hollywood Hotel have concerns about production.

"I don't think anyone realizes yet what (the potential of 3-D concert films)," said Josh Greer, president of 3-D provider Real D, speaking of the "Hannah Montana" concert film and the frenzy behind the live performance ticket sales.

"The idea that you go from a venue where you have 5,000 or 7,000 seats to being able to offer 1 million tickets -- I think this is going to have a profound effect on the type of content in theaters. That will also relieve some of the pressure for 3-D content."

But a potential holdup to increased live-action 3-D production is the commercial availability of 3-D cameras, said Bob Mayson, Kodak's GM of digital motion imaging and vp entertainment imaging.

"Until the day we see hundreds of commercially available rental cameras, it's going to be tough to drive the large volume of 3-D live action projects," he said. "You can get to 10-12 films a year, but to get beyond that is going to be tough. If we want to get to sports and concerts, we need to see commercially available 3-D broadcast cameras."

Shifting to 3-D in the home, 3Ality Digital will be shooting the "Live With Regis & Kelly" 3-D Halloween show today as an anaglyph NTSC broadcast. 3Ality COO and CTO Howard Postley said that this will be a four-rig shoot, with two cameras per stereoscopic rig for a total of eight cameras. The 3-D broadcast will air with the involvement of Walgreens, where viewers can pick up a pair of 3-D glasses to view the episode."

By Carolyn Giardina, The Hollywood Reporter

Interview of Tim Partridge, EVP Dolby Laboratories

"What is Dolby’s working relationship with Infitec? What roles did each of your companies have in the development of this Dolby 3D solution?
Our basic requirements for the system, based on feedback from theater owners around the world, were that it should work with the regular white screens that are currently installed, and that the glasses should not require batteries. We discovered at Infitec some core technology that we thought would enable us to meet these requirements. So we licensed the Infitec IP (Intellectual Property) for use in cinemas, and using this idea, developed at Dolby the necessary components of software and hardware for the theater, and of course, the glasses.

What were the biggest technical challenges you wanted to overcome in the development of your stereoscopic 3D (S-3D) solution?
One big challenge with any 3D system is the amount of light that is lost as you go through the filters at the projector and then through the glasses. This then limits the size of the screen you can use in the theater so we are always looking for ways to get more light.

One way we do this is by putting the filter inside the projector in between the lamp and the sensitive picture forming parts of the digital projector. The filter reduces the heat from the lamp that gets to those parts and therefore allows for a bigger lamp giving more light. We also wanted to avoid putting a moving filter in the path of the image since that inevitably has a negative impact on the final picture quality, another reason why we put the filter inside the projector.

The biggest challenge for cinemas though was the need to replace their screen with a silver screen for the other 3D systems. Not so much a technical challenge, but a very practical one since the silver screen is expensive and the picture quality provided by a silver screen is not as good as that with a white one. So we are very pleased we have been able to provide them with a 3D system that allows them to keep their white screen.

The biggest technical challenge for us in developing the system was being able to make glasses with the exact filters we needed for each eye, and manufacture them in high volumes. But we did it working with several specialist vendors and the results are stunning.

In your experience, what are the leading objections by exhibitors or movie theaters to adopt S-3D movie hardware/projectors? How are you acknowledging and circumventing those objections?
So far we have had an overwhelmingly positive reaction from exhibitors to our system. Compared with the other offerings out there, they of course like not having to change their screen, they love our quality on the screen, they appreciate the flexibility of being able to move the 3D movie from one screen to another easily, and also being able to switch quite easily from 3D to 2D on the same screen.

They also like our business model since it is the same way we have done business with them for over 30 years.

I would say the only questions we get are around the glasses - which is where the real technology lies. Since these are not $1 glasses, the exhibitor will be reusing them many times and cleaning them between each use. Once we explain how easily this can be done, and also that by reusing them many times they have a much less expensive per use model, plus they are also being kinder to the environment by not throwing all that plastic away after each screening, they see all the benefits of the Dolby 3D system.

Tell us about the technology. A colleague told me that your solution is "anaglyph on steroids". Can you explain how the technology works?
It is true that we use color to separate the left image from the right one, but that is where the similarity with anaglyph techniques ends. With anaglyph you had one color per eye, with Dolby 3D you have every color in each eye - and this leads to superb color fidelity, something that everyone who sees it instantly comments on.

How it works is that we choose a red, a green, and a blue for the left eye, and a slightly different red, green, and blue for the right eye. Once you have R, G and B you can create all the colors of the spectrum in each eye.
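The choose-two-RGB-triplets idea can be illustrated with a toy non-overlap check (the wavelength bands below are hypothetical placeholders, not Infitec's actual filter values; the point is that each eye gets its own full R/G/B set and the two sets never overlap):

```python
# Toy model of wavelength-multiplexed (spectral division) stereo 3-D.
# Band values in nanometers are hypothetical, for illustration only.
LEFT_BANDS_NM  = {"R": (629, 634), "G": (532, 537), "B": (446, 451)}
RIGHT_BANDS_NM = {"R": (615, 620), "G": (518, 523), "B": (432, 437)}

def overlaps(a, b):
    """True if two (low, high) wavelength intervals intersect."""
    return a[0] < b[1] and b[0] < a[1]

def crosstalk_free(left, right):
    """True if no left-eye band overlaps any right-eye band."""
    return not any(overlaps(l, r)
                   for l in left.values() for r in right.values())

print(crosstalk_free(LEFT_BANDS_NM, RIGHT_BANDS_NM))  # True

# Each eye still spans red, green, and blue, so each eye can reproduce a
# full color gamut -- unlike red/blue anaglyph, which gives one hue per eye.
print(sorted(LEFT_BANDS_NM), sorted(RIGHT_BANDS_NM))
```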

If a viewer watches a movie in 3D and blinks one eye at a time, will there be any ghosting, and will the colors be identical between the eyes?
One advantage of our system is that the crosstalk, or ghosting, from one eye to the other is particularly low which is why we have such sharp and beautiful images on the screen. The difference in color from one eye to the other is so small (that is why the filters in the glasses have to be so precise) that it would be hard to notice, and when both eyes are open (as is usually the case!) the brain compensates for that difference.

I understand that the glasses used for your solution are $50 a piece. Can you explain what makes these glasses special and why they cost a lot more than traditional polarized or anaglyph lenses?
It comes down to the filters. They are extremely precise which gives us superior crosstalk cancellation (i.e. the right eye image doesn’t get through the left eye filters). To do this, we have to lay down fifty layers of filters on each lens. Plus we also wanted a curved lens design to improve the viewing experience even further, and laying down fifty layers with extreme accuracy onto curved lenses is no small feat!

We also make the lenses scratch resistant and very tough so they can withstand hundreds of uses, so the cost per use comes down to just a few cents.

I wear glasses, and I can tell you first hand that they get dirty pretty quick. In a room filled with buttery popcorn, even more so. How do you clean these glasses and how often?
The glasses will be cleaned after each and every screening so they will always look perfectly clear.

What is the DCI (Digital Cinema Initiatives) standard, and how does it relate to Dolby Laboratories?
The DCI standard is a set of technical specifications written to help multiple manufacturers design and build digital cinema equipment to a common standard so that movie files packaged in Hollywood, or anywhere else, can be guaranteed to play on all these pieces of equipment in any theater.

Not only does it specify file formats and interconnects to enable this interoperability, but it also specifies high degrees of security both in hardware and software to protect the digital content files from piracy. Dolby has designed a Digital Cinema server to accept, manage, decode and play out these files, and as such, it has been designed to the DCI specification to ensure we fully meet the requirements of the studios and the exhibitors. Some of these specifications have not been quite finalized yet though, which is one of the issues we are still working through.

How much money should an exhibitor expect to spend to upgrade their equipment to Dolby 3D?
The hardware is around $20k, and each screen will need 2 pairs of glasses per seat to make sure there is always a clean pair available.

How is Dolby Laboratories positioning their offering to help justify the expense to exhibitors? What ideas have been brought to the table?
Exhibitors told us they would just like to buy the equipment up front and outright without any ongoing commitments. This is how we have always done business with them so we were happy to oblige. We are open to other models, but this is what they seem to prefer right now.

In terms of justifying the expense, exhibitors trust that Dolby equipment lasts a long time, and with the proven ability to charge a premium on each ticket for 3D, and the number of 3D films in the line-up for the next few years, I don’t think they have a problem justifying the investment.

I understand the movies have an invisible imprint that shows up on bootlegged movie copies, and this can trace a movie right down to the theater and the time it was shown at. There is a lot of pressure on exhibitors to cut down on movie piracy because of this. How does S-3D help ease the burden on movie theaters?
There are many security features built into digital cinema to combat piracy but preventing the camcorder from capturing the image on the screen is an issue that technology has yet to solve. 3D, however, is inherently protected against the camcorder copy since the image on the screen is a double image (one for the left eye and one for the right eye), and would be unbearable to watch on a pirated copy."

By Neil Schneider, Meant To Be Seen

'U2' on Tour in 3-D

"National Geographic Cinema Ventures has picked up upcoming concert docu "U2 3D," which it will distribute both domestically and internationally.

"U2 3D" will be released in late January in 3-D only.

Pic was lensed in South America during the band's "Vertigo" tour entirely in digital 3-D. In the U.S., theater owners were recently treated to an extended clip of the film at exhib confab ShowEast.

Concert pic was produced by 3ality Digital and directed by Catherine Owens and Mark Pellington. Owens has been U2's visual content director for more than 15 years, while Pellington directed the band's "One" video.

"Digital 3-D is a new cinema medium that truly allows moviegoers to immerse themselves in the experience, energy and emotion of being in a prime seat at a U2 concert," 3ality Digital CEO Sandy Climan said.

National Geographic Cinema Ventures prexy Lisa Truitt said the release of "U2 3D" is a natural expansion of her division's growing presence in theatrical distribution.

Producers are 3ality Digital's Jon Shapiro, Peter Shapiro and John Modell, as well as Owens. Climan, Michael Peyser and David Modell exec produced."

By Pamela McClintock, Variety

Akamai Unveils Unique View of the High-Definition Internet

"Akamai Technologies, the leading global service provider for accelerating content and applications online, announced The HD Web, a 'proof-of-concept' portal designed to showcase the experience consumers can have with high definition content online.

The website showcases content from a variety of industries including music, movies, professional sports, games and news. Akamai customers are now delivering a consistent, high-definition video experience on Akamai's uniquely distributed edge delivery platform that is specifically tuned for optimal delivery of high-definition (HD) files online. As an industry leader and pioneer in online video, Akamai has raised the bar for what a service provider must offer companies in order to deliver this content effectively and efficiently online today.

Companies providing HD content for the initiative include Apple, BBC Motion Gallery, CBS, Gannett, MTV Networks, NBA and more. The proof-of-concept portal will serve as a temporary Internet Programming Guide to HD video on the web and provide access to a complete HD experience. These companies understand the value that HD content brings their brand and are ahead of the curve by offering HD programming online.

A critical factor in enabling high-bit-rate delivery of very large HD files is the proximity of the end user to the server sending the file. As the distance from the server grows, throughput dramatically decreases. Even a seemingly small distance can cost throughput through higher packet loss and increased latency. The more latency, the longer it takes to download the file, which can interrupt the viewing experience and result in a poor end-user experience. Akamai works closely with leading broadband ISPs to deploy servers directly in those networks to ensure that content is served closest to the end user for a superior HD experience. Compared with other centralized models, the results that Akamai's network offers are unmatched on a global scale.
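The distance-throughput relationship described above is often approximated by the classic Mathis et al. model of steady-state TCP throughput (throughput ≈ MSS / (RTT x sqrt(loss))). The sketch below uses illustrative numbers, not Akamai's measurements:

```python
import math

# Mathis et al. approximation for steady-state TCP throughput:
#   throughput ~= MSS / (RTT * sqrt(loss_rate))
# It shows why moving the server closer (lower RTT) matters so much.
def tcp_throughput_bps(mss_bytes, rtt_s, loss_rate):
    return (mss_bytes * 8) / (rtt_s * math.sqrt(loss_rate))

MSS = 1460  # bytes, a typical Ethernet maximum segment size

# Same 0.1% packet loss, increasing round-trip time:
for rtt_ms in (5, 50, 200):
    mbps = tcp_throughput_bps(MSS, rtt_ms / 1000, 0.001) / 1e6
    print(f"RTT {rtt_ms:3d} ms -> ~{mbps:6.1f} Mbit/s")
```

With these assumed numbers, a 40x increase in round-trip time cuts the achievable per-connection rate by the same factor, which is exactly the edge-delivery argument the article makes.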

Akamai recently outlined technical criteria for delivering HD content on the Internet. Akamai has architected its platform to comply with the following technical criteria that content owners must leverage to successfully enable an HD Web. Akamai believes that it is the first and only platform to meet these technical requirements which include offering:

- Technology and an operational model to operate serving devices in the largest high-throughput networks around the world (servers need to be physically in the networks, as that is where the capacity lies).

- Established relationships with the largest high throughput networks.

- Support for delivery, storage, and management of files greater than 2 Gigabytes.

- Support of VC-1 and MPEG-4 video standards, achieving visual parity with other broadcast video networks.

- Support for files with resolutions of 720p, 1080i and 1080p.

- Client-side technology that is deeply integrated into its delivery system to be deployed as appropriate."

Source: Akamai

Watermarking and Fingerprinting

"Watermarking and fingerprinting are two forms of technology known generically as content identification. Watermarking works by embedding data into digital images, audio, or video in such a way that the data is very difficult to remove and the effect on a user's perception of the content is (usually) nonexistent. The data embedded in a watermark is often the identity of the content, though it could also include the identity of a user or device that downloaded it, or of a retailer that sold it.
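The embedding idea can be illustrated with a deliberately simple least-significant-bit scheme (a toy sketch only; commercial cinema watermarks use far more robust, perceptually shaped embedding that survives compression and camcorder capture):

```python
# Minimal illustration of watermark embedding: hide an ID payload in the
# least significant bit of 8-bit pixel samples.
def embed(pixels, payload_bits):
    """Overwrite each pixel's LSB with one payload bit."""
    return [(p & ~1) | b for p, b in zip(pixels, payload_bits)]

def extract(pixels, n_bits):
    """Read the payload back out of the LSBs."""
    return [p & 1 for p in pixels[:n_bits]]

frame = [200, 201, 17, 94, 55, 120, 253, 8]   # toy 8-bit samples
mark = [1, 0, 1, 1, 0, 0, 1, 0]               # e.g. a theater/device ID

marked = embed(frame, mark)
assert extract(marked, 8) == mark

# Each sample changed by at most 1 of 255 levels -- visually negligible:
print(max(abs(a - b) for a, b in zip(frame, marked)))  # 1
```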

Fingerprinting is a set of techniques for analyzing content, reducing its unique characteristics to a set of one or more numbers that serve as "fingerprints," and looking those fingerprints up in a database to determine the identity of the content.
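The analyze-reduce-lookup flow can be sketched as a toy fingerprinting pipeline (the energy-trend feature below is a hypothetical illustration; real systems such as those from Audible Magic or Gracenote use perceptual features robust to re-encoding):

```python
import hashlib

# Toy content fingerprint: reduce a signal to a coarse, content-dependent
# feature, hash it, and look the hash up in a database of known works.
def fingerprint(samples, block=4):
    # Coarse feature: the sign of the energy change between adjacent
    # blocks, which survives uniform volume scaling.
    energies = [sum(x * x for x in samples[i:i + block])
                for i in range(0, len(samples) - block + 1, block)]
    trend = "".join("1" if b > a else "0"
                    for a, b in zip(energies, energies[1:]))
    return hashlib.sha1(trend.encode()).hexdigest()[:16]

database = {}  # fingerprint -> content identity

track = [3, 1, 4, 1, 5, 9, 2, 6, 5, 3, 5, 8, 9, 7, 9, 3]
database[fingerprint(track)] = "Example Track"

# A volume-scaled copy still maps to the same identity:
copy = [2 * x for x in track]
print(database.get(fingerprint(copy)))  # Example Track
```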

Interest in both of these techniques has been growing rapidly in recent months. They are passive, meaning that their use in identifying and tracking content does not (by itself) interfere with a user's ability to play, copy, or send it. They complement or substitute for active content-control techniques such as the encryption used in typical DRM technologies.

Practical applications of digital watermarking for tracking content usage have been in existence for roughly a decade, but during the first Internet bubble, watermarking vendors oversold the content industry on the technology as a panacea for Internet piracy. This resulted in a backlash against a set of techniques that, in retrospect, were fairly basic. But watermarking techniques have become much more sophisticated and useful recently, and a wider variety of participants in the content value chain have gotten involved. Many vendors are involved in the watermarking arena, including Digimarc, Philips, Thomson, Cinea, Verimatrix, Activated Content, USVO, and Bitmunk.

Fingerprinting is a more recent technology; it came about in the 2000-2001 timeframe, when it was proposed as a way to make the original Napster P2P network copyright-compliant. Now a handful of music services (e.g., iMesh/BearShare) use audio fingerprinting, most typically to block uploads of copyrighted music tracks to P2P networks. Such services are often licensed by the major music companies, indicating their increasing comfort level with the technology -- although no one believes that it works one hundred percent.

Audible Magic and Gracenote are two of the leading audio fingerprinting technology vendors. SNOCAP uses fingerprinting (from Gracenote) to power services like its ad-driven model with imeem. Fingerprinting is capable of supporting wide ranges of innovative content business models; as with watermarking, the surface has barely been scratched, and we'll see some very interesting developments in the near future.

More recent fingerprinting solutions focus on video content, which is more technologically challenging than audio. As we saw last week, Google unveiled a video fingerprinting scheme, which is turning out to be less sophisticated than third-party technologies that are being developed by vendors such as Audible Magic, Philips, Vobile, Zeitera, and others. Attributor has a variation on this theme: a fingerprinting scheme for text content, which is being used by some of the major news wire services to track placement (both licensed and unlicensed) of their news content on various websites.

Watermarking and fingerprinting are distinct yet synergistic technologies. Their importance in the world of digital content rights is growing rapidly; in time, they may become more important than encryption-based DRM technology in certain media market segments."

By Bill Rosenblatt, DRM Watch

SeeReal Shows a Holographic 3D Display Prototype

"During Flat Panel Display International (FPDI) 2007, most of the major players were showing at least one more-or-less conventional 3D display, but SeeReal was showing a technological demonstrator of a 21-inch holographic 3D display. SeeReal has quite a way to go before they reach their goal of a 42-inch full color holographic TV - the demonstrator was monochrome and presented wire-frame images - but, significantly, the company has developed a way of creating a real-time holographic TV that brings the computational overhead down to a manageable level. By the time of the next SID show in May, SeeReal may have something that begins to look like a full-color TV prototype."

Source: Display Daily

Thomson Teams with Pathfire & DG FastChannel to Offer Capture Service for HD File-Based Distribution

"Thomson has developed a faster-than-real time direct file import option with Pathfire to increase the high-definition (HD) content management capability of the Grass Valley K2 Media Server platform from Thomson.

The new K2 “Capture Service” software application enables the fast and easy handling of HD programs as digital files and eliminates the need for any third-party intermediate solutions that can slow the file-handling process considerably—in many cases to much slower-than-real time—saving organizations significant time and money.

Available immediately, the K2 Capture Service option is the result of a close collaboration between Thomson and Pathfire engineers and requires a simple software upgrade to existing K2 systems. In addition, the Capture Service option will also support SD and HD material transfer from the DG FastChannel Spotbox to K2. No additional hardware is needed.

The new K2 Capture Service feature was recently field-tested by Tribune Broadcasting with nationally syndicated content. During the week of September 6th, Tribune went to air at their WGN, KTLA, WXIN, WPIX, WGNO, KHCW, KDAF, KTXL, WPHL, WDCW, KRCW, WSFL, and KCPQ locations from a series of remotely located K2 servers, successfully using the new Capture Service feature to distribute the program as a series of HD files faster than the program has ever been distributed before.

With this new Capture Service capability, Grass Valley K2 users now have a highly efficient file transfer method that allows HD content to be uploaded to a K2 server timeline nearly instantaneously (5-10X faster than real time), without the need for external transcoding or file conversion. By comparison, existing file-transcoding techniques require a separate server for HD transcoding, and current speeds are on average 2X slower than real time (making them nearly 10X slower than the new Capture Service option). With significant amounts of content needing to be uploaded every day, it could easily take operations departments longer than a single eight-hour shift to upload and QC less than three hours of material.
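The quoted savings follow from simple rate arithmetic. The sketch below applies the article's own figures (0.5X real time for the legacy transcode path, and the low end of the 5-10X range for Capture Service) to the three-hour example:

```python
def transfer_hours(content_hours, speed_factor):
    """Hours needed to move content at a given multiple of real time.
    speed_factor > 1 means faster than real time; < 1 means slower."""
    return content_hours / speed_factor

content = 3.0  # hours of HD material, per the example above

# Existing transcode path: roughly 2X slower than real time (0.5X speed).
legacy = transfer_hours(content, 0.5)
# Capture Service: 5-10X faster than real time; take the low end.
capture = transfer_hours(content, 5.0)

print(f"legacy transcode: {legacy:.1f} h")   # 6.0 h, most of a shift
print(f"capture service: {capture:.1f} h")   # 0.6 h
print(f"speed-up: {legacy / capture:.0f}x")  # 10x, matching the claim
```

Even before QC time is counted, the legacy path alone consumes most of an eight-hour shift for three hours of material, which is the bottleneck the article describes.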

The Grass Valley K2 Media Server platform from Thomson has been embraced by a wide variety of broadcasters and content providers, with more than 1,400 K2 systems now in operation around the world. All of these systems can be easily upgraded with the new Capture Service automatic file conversion option.

The Capture Service option is available immediately from Thomson, and is priced at $5,000."

Source: BroadcastBuyer

LeVar Burton Shoots First 4K Feature with DALSA Origin Camera

"“Tempting Hyenas”, the first feature-length film shot with the DALSA Origin 4K digital cinema camera, will undergo a full 4K DI at Post Logic Studios.

Post Logic’s Image Science Division provided on-set supervision during principal photography, which completed recently, working closely with the production team to oversee the transfer of digital assets and ensure that the 4K image quality was maintained throughout.

Post Logic’s 4K workflow, developed in cooperation with DALSA Digital Cinema, begins with the careful handling of assets at the end of each day of shooting. Data from the Origin 4K camera (up to two terabytes a day) was recorded onto a Codex unit and output as 2K ProRes files for viewing dailies and for creating an edit decision list (EDL) for the online edit.

The files were offloaded daily from the Codex onto a Ciprico Media Vault for transporting back to Post Logic Studios in Hollywood. Upon arrival, the data was backed up onto 400GB LTO3 tapes, with all assets and metadata meticulously catalogued in a proprietary database. This process ensured that the production could shoot continuously without skipping a beat. With the LTO3 tapes serving as the 4K image master files, Post Logic Studios will match up the time codes with the EDL to create the final 4K product."

Source: BroadcastBuyer

SAMMA Systems Extends Range with MJPEG2K Player

"SAMMA Systems, the global leader in media migration, has extended its range of lossless compression products with the introduction of its new MJPEG2K Player. The player/decoder, developed to provide an extremely cost-effective method for monitoring and editing Motion JPEG2000 files, is making its official debut at SMPTE 2007.

The MJPEG2K Player card operates in conjunction with SAMMA Systems' recently launched, award-winning SAMMA Solo - the world's first real-time analog to digital lossless migration system. SAMMA Solo encodes mathematically lossless MJPEG2000 files in real-time in addition to other standard formats such as MPEG 2, H.264, Windows Media and Real Media. The MJPEG2K Player, the first low cost decoder for PCs, provides an extremely cost effective method for playing back the MJPEG2000 files for monitoring or loading into an editing platform. It provides SDI output for exceptional viewing quality with embedded 4-channel audio that is bit-for-bit identical to MJ2 output.

The SAMMA MJPEG2K Player functions as a decoder only, with SAMMA software, for use in stand-alone viewing stations. It requires a user-supplied PC and monitor."

Source: SAMMA Systems

New IP Technology Moves HD Video Directly from Camcorder to Networked Storage

"In an effort to bypass the ingest process, two companies have demonstrated technology that enables direct digital video recording to networked storage from a 1394-equipped high-definition video camera.

The technology from Control Communications Systems and QVidium Technologies, called 1394 Gateways, enables direct recording to storage area network (SAN) or Network Attached Storage (NAS) directly from the camcorder.

The system uses QVidium Record Manager Software to take the video and audio coming from the camcorder, or other 1394-equipped system, and converts the feeds into standard IP packets. Once in the IP domain, the media can flow across networks that can span a studio or any location in the world.
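The gateway's basic job, as described, is to chunk a 1394 byte stream into IP packets. A minimal sketch might look like the following; the 4-byte sequence header, the address, and the port are placeholders of my own invention, and a real gateway would more likely use standard RTP encapsulation:

```python
import socket
import struct

def packetize(stream, chunk=1316):
    """Split a raw DV/audio byte stream into sequence-numbered UDP
    payloads. The 4-byte big-endian sequence header is hypothetical
    framing for illustration, not QVidium's actual wire format."""
    for seq, off in enumerate(range(0, len(stream), chunk)):
        yield struct.pack("!I", seq) + stream[off:off + chunk]

# Send a dummy "camcorder feed" to a storage host on the local network.
feed = bytes(range(256)) * 40          # stand-in for 1394 frame data
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
for pkt in packetize(feed):
    sock.sendto(pkt, ("127.0.0.1", 50000))  # placeholder address/port
sock.close()
```

Once the stream is in IP datagrams like these, ordinary routing carries it to a SAN or NAS anywhere on the network, which is exactly what frees the workflow from the 1394 cable-length limit.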

“The 1394 interface is very powerful, but many uses of these HD cameras are limited by the short cable length,” said Anthony Magliocco, Controlware’s director of sales and marketing. Now, he said, the camcorder’s images and sound can be instantaneously stored and edited anywhere in the world.

Using 1394 Gateways, camera operators can continue to record to the normal media in the camcorder while feeding the network drives simultaneously. Controlware, a company that specializes in loss-free terrestrial and satellite delivery of broadcast video, said it is offering both wired and wireless versions of the 1394 Gateways technology.

The technology offers support for all DVCPRO formats, including DVCPRO HD, DVCPRO50 and DVCPRO25, as well as 19.75 Mbps 720p and 25 Mbps 1080i HDV and a wide range of audio formats. It can display HD video to a PC’s VGA or DVI interface."

Source: StudioDaily

Making 3D Movies – Part II

"Yesterday, I attended the SMPTE pre-conference on 3D held in Brooklyn, New York. It was a nice complementary conference to the 3D Cinema session held at 3D BizEx last month in Burlingame, CA, as I learned more about the 3D cinema process and issues - and had a chance to have dinner with a who’s who of 3D experts.

Last month, Matt Brennesholtz wrote that 3D movie making requires a completely new way of approaching the movie. 3D specialists are needed, an intensive post-post process for 3D FX is required, and creating a 3D movie is expensive. Yesterday, I heard some of these themes amplified, but learned a lot more about the mistakes that can happen in 3D movie making and how some of these are fixed.

The list of potential mistakes is extensive, including left/right eye reversal, unintentional monoscopic frames, out-of-sequence stereo frames, and more. We heard a lot about choices made in dimensionalizing a movie. Keeping action within a limited depth volume is a good thing; violating this causes eyestrain. A scene transition where the focus depth changes even modestly is noticeable and can cause discomfort. If the content is dimensionalized for a 20-foot screen, it does not look good on a 60-foot screen, and vice versa. There is some concern that movies will need to be re-dimensionalized depending upon the screen size where they are shown, including the small screens of RPTVs and home theaters.
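The screen-size problem has a simple geometric root: a disparity authored as a fraction of image width grows in absolute inches with the screen, and background separation beyond the roughly 2.5-inch human interocular distance forces the eyes to diverge. The 1% disparity figure below is an illustrative assumption of mine, not a number from the panel:

```python
def parallax_inches(screen_width_ft, parallax_frac):
    """On-screen separation of left/right images, in inches, for a
    disparity expressed as a fraction of image width."""
    return screen_width_ft * 12 * parallax_frac

INTEROCULAR = 2.5  # approximate adult eye separation, inches

# A background disparity of 1% of image width, graded for a 20-ft
# screen, lands just under the interocular limit there -- but the same
# grade on a 60-ft screen pushes the eyes well past it.
for width in (20, 60):
    p = parallax_inches(width, 0.01)
    verdict = "forces divergence" if p > INTEROCULAR else "comfortable"
    print(f"{width}-ft screen: {p:.1f} in parallax, {verdict}")
```

The same scaling works in reverse for small screens, which is why content may need re-dimensionalizing for RPTVs and home theaters rather than a straight transfer.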

What was exceptional about the event was all of the 3D content that was shown. For example, some content was shown to demonstrate the effect of mistakes and errors. Other demos were used to show how parameters could be adjusted to create better 3D effects that reduce eyestrain, augment or reduce the 3D effect, and create 3D images that are "painless and beautiful."

We saw clips from Beowulf, Chicken Little, Star Wars, U2 3D (gave me shivers), Open Season, Nightmare Before Christmas, Polar Express, as well as lots of other content shown by some of the producers and cinematographers in the room.

Many of these scenes were very good, some outstanding, but some also produced eyestrain - for some of the very reasons explained by these same experts. This audience included some of the most experienced and talented 3D content creators on the planet, and yet the demos of 3D content were clearly not defect free. Granted, this could probably have been fixed with money and time, but it is also a reflection of the state of 3D content creation. With a lot of effort, outstanding 3D content can be made when the display format is known and the exhibition format determined. But this content is clearly not plug and play from one venue to another. This hurdle will take some time to overcome.

There were two other clear messages that came through from the conference:
- The 3D industry needs to train a lot more people in the art and science of 3D movie making.

- Better 3D workflow tools, particularly in 3D editing, are desperately needed.

Workflow tools are being developed in-house by the major studios right now, as if the industry is "building the bridge as it walks across it." This is changing, however, with more sophisticated tools expected from commercial suppliers over the next couple of years.

My colleague George Isaacs attended a very similar event in London on the same day and heard almost identical ideas and messages. One surprise from both shows, and one which generated some interesting dinner conversation, was subtitling for 3D movies. Warner Brothers is doing the international distribution of Beowulf and thinks subtitles are needed instead of dubbing. They experimented with various subtitling methods and showed the results to the audience in NYC, using Polar Express as a test movie. The subtitles were shown over the content and in black bands below and above the movie. The text was also placed at the screen plane as well as in front of and behind the screen. Most interestingly, the majority of viewers preferred text above the movie, and placing it even a little in front of the screen seemed better still. We may need to call these "supertitles," as this is the format Warner Brothers will use.

Overall, the mood of the crowd at the New York event was quite upbeat. They know they are at the beginning of a major transition in Hollywood and that challenges remain, but they are very bullish about the future. As one participant paraphrased an exchange that occurred between DreamWorks Animation CEO Jeffrey Katzenberg and National Association of Theater Owners (NATO) President and CEO John Fithian at ShowEast last week, "Is 2D digital cinema the dog and 3D the tail, or has 3D become the dog wagging the 2D digital cinema tail?" Guess what answer this partisan crowd favored?"

By Chris Chinnock, Display Daily

Doremi Cinema's DCP-2000 Server Receives FIPS Level 3 Certification

"Doremi Cinema announces that its DCP-2000 cinema server has officially received a Federal Information Processing Standards (FIPS) 140-2 Level 3 validation certificate.

FIPS Level 3 compliance provides the DCP-2000 with the highest level of protection required by Digital Cinema Initiatives (DCI) to secure the motion picture files used in the cinema server.

"Achieving FIPS certification brings the highest level of comprehensive security DCI compliance for our server," said Michael Archer, VP of Sales at Doremi. "Our security solution also allows for maintenance to be performed at the theater level without special personnel, since our security design left maintenance items easily accessible."

The Doremi server underwent a comprehensive testing process by the accredited cryptographic module testing laboratory InfoGard. InfoGard submitted its test report to NIST, which issued the FIPS 140-2 Level 3 validation certificate. Doremi's NIST certification number is 850.

Doremi Cinema's DCP-2000 is by far the most widely installed cinema server in the world, with over 4,000 screens worldwide. Doremi's continued leadership in installations underscores the reliability and consistency of the DCP-2000 server in providing both the highest quality JPEG2000 images and the highest levels of security sought by the major studios to protect their content."

Source: DCinemaToday

IMAX Sets Target Launch Date of Digital Projection Technology

"IMAX Corporation announced that it has moved up the launch date of its digital projection system in development to the second quarter of 2008 from its previously announced timeframe of the end of 2008 to mid 2009. The highly anticipated IMAX digital projection system will further enhance The IMAX Experience and help to drive profitability for studios, exhibitors and IMAX theatres by virtually eliminating the need for film prints, increasing program flexibility and ultimately increasing the number of movies shown on IMAX screens.

Under the current roll-out schedule, the company anticipates that three digital IMAX prototypes will be installed during the second quarter of 2008. Shortly thereafter, IMAX expects to install three additional prototypes. Once these prototypes meet performance specifications, IMAX expects to proceed with a full rollout during the second half of the third quarter and in the fourth quarter of 2008.

IMAX's digital projection system integrates a suite of proprietary IMAX intellectual properties with commercially available digital projection technology in a way that creates The IMAX Experience in a digital format. These properties, along with proprietary technology applied to the content, dramatically enhance the image fidelity, light output and contrast in both 2D and 3D to produce a stunningly crisp and bright image on the big IMAX screen and deliver the unparalleled image and sound quality that IMAX consumers have come to recognize and enjoy. In consumer testing conducted by Millward Brown, a respected market research firm, 98 percent of respondents who had seen IMAX before (and were thus able to make the comparison) said that the prototype IMAX digital system fits with their expectations for the brand, and 46 percent said that the overall experience in the digital IMAX theatre was better than previous IMAX experiences.

The new system is configured for an IMAX MPX-style auditorium and is capable of showing Hollywood movies that have been digitally re-mastered using IMAX's proprietary DMR technology in both IMAX and IMAX 3D. The system will also be capable of showing original IMAX documentaries.

IMAX has already announced several multi-theatre agreements which are to include the new digital projection system. The company has also indicated that it intends to offer and sell upgrades to the new digital system to commercial operators who have IMAX MPX systems.

In North America, IMAX signed a joint venture agreement with Regal Cinemas for five systems, with three of the locations identified as direct to digital installs during the fourth quarter of 2008 and second quarter of 2009. Similarly, IMAX signed a joint venture agreement with Muvico Theaters for three systems, with the third targeted to be a digital install in Muvico's highly anticipated Xanadu complex in New Jersey. The Company also entered into a second multi-theatre agreement with Goodrich Quality Theaters, following the highly successful launch of the exhibitor's first two theatres. The new agreement includes a digital installation in a new multiplex planned for the fourth quarter of 2009.

Internationally, IMAX announced its largest ever multiple-theatre deal in Asia with China's Wanda Cinema Line Corporation. The agreement includes seven locations expected to utilize IMAX's digital projection system."

Source: IMAX

GDC Selects Thomson’s Forensic Watermarking for Digital Cinema Server Roll Out

"Thomson today announced that GDC, one of the leading solution providers for digital cinema, has selected NexGuard, Thomson’s comprehensive, state-of-the-art forensic tracking product line, for integration into 1,200 digital cinema servers.

GDC digital cinema servers will now embed NexGuard’s audio and visual forensic watermarking solution. NexGuard combats in-theatre piracy by offering forensic means to identify the date, time and location of illegal camcorder recordings.
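NexGuard's embedding method is proprietary, but the general idea of hiding an identification record in image samples can be shown with the textbook least-significant-bit scheme below. The payload format is invented for illustration, and a real forensic mark must survive camcorder capture and compression, which plain LSB embedding does not:

```python
def to_bits(data):
    """Expand bytes into a flat list of bits, LSB first per byte."""
    return [(byte >> i) & 1 for byte in data for i in range(8)]

def from_bits(bits):
    """Reassemble a bit list (LSB first per byte) back into bytes."""
    return bytes(sum(b << i for i, b in enumerate(bits[k:k + 8]))
                 for k in range(0, len(bits), 8))

def embed(pixels, bits):
    """Overwrite each pixel's least-significant bit with a payload bit.
    Textbook LSB watermarking -- not robust, purely illustrative."""
    out = list(pixels)
    for i, b in enumerate(bits):
        out[i] = (out[i] & ~1) | b
    return out

def extract(pixels, nbytes):
    """Read the payload back out of the least-significant bits."""
    return from_bits([p & 1 for p in pixels[:nbytes * 8]])

payload = b"2007-10-26 19:30 screen-7"   # hypothetical forensic record
frame = [128] * 512                      # stand-in for luma samples
marked = embed(frame, to_bits(payload))
print(extract(marked, len(payload)))     # → b'2007-10-26 19:30 screen-7'
```

The mark changes each sample by at most one level, so it is invisible to the audience, yet a recovered copy identifies the date, time, and screen, which is the forensic principle the article describes.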

NexGuard’s solution not only exceeds the Digital Cinema Initiatives (DCI) specifications with resistance to illegal camcorder capture and compression, but also provides the ability to embed more than the required amount of critical identification information.

The NexGuard family of content security solutions has been solely designed to serve the media, entertainment and communication industries, and offers the most wide-ranging line of products to track and secure digital audiovisual content through production, post-production, distribution and exhibition."

Source: DCinemaToday

Microspace Beams DreamWorks’ “The Heartbreak Kid” to Theaters Nationwide

"Microspace Communications Corporation (Microspace), the leading distributor of digital cinema via satellite, today announced the satellite distribution of DreamWorks Pictures’ “The Heartbreak Kid” to theaters nationwide.

“Paramount (which is distributing “The Heartbreak Kid”) is at the forefront of changing the movie-going experience, leveraging digital delivery as a key element for the highest level of quality,” said Jim Tharp, President, Distribution, Paramount. “By tapping Microspace to deliver our films to theaters, we ensure that they will arrive securely and provide the true digital quality that discerning moviegoers demand.”

Microspace delivered “The Heartbreak Kid” to nine theaters starting October 5, 2007.

“Now is the time for both studios and exhibitors to utilize satellite distribution for the highest quality presentation the first time and every time,” said Joe Amor, general manager of Microspace. “Microspace has delivered 16 movies in the last year and a half and continues to work closely with the movie industry to ensure the most advanced satellite distribution capabilities.”

Microspace collaborates with studios, content preparation companies and exhibitors to utilize satellite distribution and its benefits. The proven workflow and electronic delivery of Microspace’s satellite distribution provide the industry with a turn-key solution for content delivery and minimize the potential issues and costs associated with physical delivery. Through the use of two discrete satellite systems, movies and keys are delivered on time, every time at Microspace-connected theatres."

Source: DCinemaToday

Real D Continues Global 3D Cinema Dominance

"In its largest international distribution partnership to date, Real D, the global leader in digital 3-D, has finalized an agreement with Odeon and UCI, the largest pan-European cinema exhibitor with more than 1,600 screens, to install up to 500 Real D 3D cinema systems in theaters across Europe. The rollout begins immediately and continues over the next two years as digital cinema systems are deployed, bringing Real D 3D technology to new markets such as Spain and Italy while substantially increasing Real D’s footprint in the UK, Ireland, Germany, Austria, and Portugal.

Almost a third of Odeon and UCI’s cinema circuit will eventually be Real D enabled, bringing next-generation 3D cinema to millions of film fans. Some of these systems will be available in time for the release of the Warner Brothers and Robert Zemeckis film “Beowulf” in November and for the annual re-release of Tim Burton’s Disney classic “The Nightmare Before Christmas 3D”."

Source: DCinemaToday