Using Photographs to Enhance Videos of a Static Scene

We present a framework for automatically enhancing videos of a static scene using a few photographs of the same scene. For example, our system can transfer photographic qualities such as high resolution, high dynamic range and better lighting from the photographs to the video. Additionally, the user can quickly modify the video by editing only a few still images of the scene. Finally, our system allows a user to remove unwanted objects and camera shake from the video. These capabilities are enabled by two technical contributions presented in this paper.

First, we make several improvements to a state-of-the-art multiview stereo algorithm in order to compute view-dependent depths using video, photographs, and structure-from-motion data. Second, we present a novel image-based rendering algorithm that can re-render the input video using the appearance of the photographs while preserving certain temporal dynamics such as specularities and dynamic scene lighting.

Some of the video enhancements produced by our system
Given a low quality video of a static scene (top row) and a few high quality photographs of the scene, our system can automatically produce a variety of video enhancements (bottom row). Enhancements include the transfer of photographic qualities such as high resolution, high dynamic range, and better exposure from photographs to video. The video can also be edited in a variety of ways (e.g., object touchup, object removal) by simply editing a few photographs or video frames.


3-D Ghosting

Ghosting is the result of imperfect optical components in the 3D projection system. (The system here means everything between the projector and the viewer's eyeball, inclusive.)

In polarized systems it is the result of polarizers that aren't perfect, light-scattering sources like dirty port windows and dust in the air, and scattering from the silver screen. Today's polarizers achieve very high extinction ratios (providing quite pure polarization), in the range of >1000:1, while still yielding high transmission. The primary source of scattering is the screen. The typical "native" ghost performance in a polarized system is close to 100:1, accounting for all factors in the light path.
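As a rough sanity check on those figures, independent leakage fractions add, which is why even >1000:1 polarizers can still yield a system near 100:1 once screen scatter dominates. A minimal Python sketch, with assumed component values:

```python
# Illustrative leakage budget for a polarized 3D system.
# All component values below are assumptions for this sketch, not measurements.
polarizer_leak   = 1 / 1000  # >1000:1 extinction-ratio polarizers
screen_leak      = 1 / 120   # assumed depolarization from silver-screen scatter
port_window_leak = 1 / 2000  # assumed dirt/dust scatter elsewhere in the path

# Independent leakage fractions add linearly.
total_leak = polarizer_leak + screen_leak + port_window_leak
system_ratio = 1 / total_leak
print(f"system ghost performance ≈ {system_ratio:.0f}:1")
```

With these assumed numbers the screen dominates the budget, consistent with the observation above that the screen is the primary source of scattering.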

In shuttered systems (shutter glasses), the source of ghosting is the imperfect operation of the liquid crystal in the shutter glasses, coupled with an angular dependence of the light going through the lenses. Measured straight through, the shutter glasses used in theatres today run >200:1; some will produce greater than 1000:1.

The spectral division technology (Dolby's Infitec) ghosts because the color separation from left eye to right eye isn't perfect - there is a finite overlap on the filters which allows leakage from one eye to the other. The color filters for this system are created as thin-film interference filters, and as such change their color characteristics if the light does not go straight through the filter. Dolby has addressed this by using curved lenses, so when you look out of the edge of the glasses, the light still goes (more or less) straight through the thin films, maintaining color. I have not recently measured a Dolby system, so can't comment on the physical leakage.

In the REAL D circular polarized system, with testing, we deemed the physical leakage causing ghosting to be too large to provide a consistently good performance. I developed the ghost buster as a means to provide better performance. Fundamentally it works as Simon Burley stated: "...subtract a proportion of the left image from the right and vice-versa, ..." The portion of the image to be subtracted is determined by a model we created to characterize the leakage in a typical theatre system. It should be noted that under certain conditions, the ghost busting pre-processing is capable of completely eliminating the ghost image. (The model is deterministic.) In practice, ghosting will still occur at the edges of the theatre, and in extremely high contrast situations with large separation, but in almost all cases is substantially improved by the process.
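The subtraction Cowan describes can be sketched in a few lines. This is a first-order illustration with a hypothetical constant leakage fraction, not REAL D's deterministic per-theatre model:

```python
# First-order crosstalk-cancellation ("ghost busting") sketch.
# LEAK is a hypothetical constant; the real model characterizes the
# leakage of a typical theatre system and is more sophisticated.
LEAK = 0.05  # assume 5% of each eye's light reaches the other eye

def ghost_bust(left, right, leak=LEAK):
    """Pre-subtract the predicted leakage of the opposite eye from each channel.
    Clamping at zero is why very dark, high-contrast areas can still ghost:
    there is nothing left to subtract from."""
    new_left  = [max(0.0, l - leak * r) for l, r in zip(left, right)]
    new_right = [max(0.0, r - leak * l) for l, r in zip(left, right)]
    return new_left, new_right

left_px  = [0.8, 0.2, 0.0]
right_px = [0.1, 0.9, 0.5]
gb_left, gb_right = ghost_bust(left_px, right_px)

# What the left eye actually sees once projection leakage re-adds crosstalk:
seen_left = [l + LEAK * r for l, r in zip(gb_left, gb_right)]
```

The first pixel comes back very close to its original value, while the pure-black third pixel retains a small residual ghost because the subtraction clipped at zero - the "extremely high contrast" failure case noted above.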

REAL D has a hardware box that performs ghost busting in real time over dual HDSDI links, at full 2K resolution (2048). This is deployed in a number of post production houses in the US and Europe to support mastering of 3D films. Current practice is to do final color timing of the film while looking at the images at theatrical light levels (nominal 4.5fL +/-1) using a 3D system. REAL D's system is used in most of the post houses - as it represents >95% of the US market share of theatres. Final color timing is done with the ghost buster in place at the input of the projector in the timing suite (and everyone is wearing glasses.) The resultant master does not have ghost busting in it. For REAL D theatre distribution, the master is then played in real time through the ghost buster, and saved to DDR, creating the ghost busted master.

We (REAL D) recognize the distribution logistics difficulties of having both GB and non-GB masters. We are moving to put real-time ghost busting into the theatres, so that a single non-ghost-busted master will play properly. This has the advantage that the ghost-busting processing can be tuned to the specific theatre for optimum results. (We are not ready to announce a timeline for this deployment.)

By Matt Cowan, Chief Scientist REAL D, CML 3D

Quantel Stereo3D Broadcast Server

Quantel will be showing during IBC a technology preview of the sQ Server working with Stereo3D content. The aim is to demonstrate the inherent capability of the sQ platform and stimulate discussion with customers who are excited about Stereo3D broadcasting. The sQ server will be recording and playing back two streams of HD in perfect frame lock using the existing sQ Record and sQ Play control applications. Live HD Stereo3D content will be recorded from the 3ality Digital camera rig on the stand. The server uses the high quality AVC-I 100 compression format to ensure the best possible stereo experience. The recorded 3D clip will be treated operationally as a single entry and will be available in the library and to sQ Play for playout. Several playout modes are supported including dual channel, single channel side by side for Hyundai Xpol 3D displays and single channel picture-in-picture for confidence review on conventional HD displays.

Source: Quantel

Quantel - RED Camera support on eQ, iQ and Pablo

Our new RED importer allows import of the raw r3d files produced by the RED camera.

Key features are:
- Preview r3d file before importing.
- Partial file import - set in and out points and import between them.
- Import at 4K, 2K or 1K resolution (full, half and quarter res) with different quality options.
- Full use of all the RED options including exposure, colorspace, color temperature and more.

The RED importer will be available in V4.0 software during October 08 and will be free of charge to customers with V4.

Previewing an r3d file before it is imported

Some of the many RED import options

Source: Quantel

DDD Signs Development Agreement with Hyundai IT to Expand 3D LCD TV Range

DDD Group plc, the 3D consumer solutions and content company, today announces that it has signed a development agreement with Hyundai IT Corp. ("Hyundai") to expand Hyundai's range of 3D HDTVs that incorporate DDD's TriDef Core embedded hardware solution.

Under the terms of the agreement, DDD will integrate the TriDef Core with Hyundai's range of 32" LCD TVs that incorporate Arisawa Manufacturing Company's X-Pol 3D optical material. The new 32" models are scheduled for launch during October in Japan to support the BS11 3D television programming from Nippon BS Broadcasting Corporation. DDD will also deliver a version of the TriDef Core compatible with European broadcast formats.

TriDef Core is a custom circuit board that integrates with the existing 2D video electronics in the LCD HDTV. 3D features enabled by the TriDef Core processor include decoding the BS11 3D broadcast signal format, playback of specially encoded high definition Blu-ray discs and real time 2D to 3D conversion of standard and high definition content.

Using Arisawa's innovative 3D optics, the flat screen 3D televisions are capable of displaying conventional 2D pictures as well as 3D. When the viewer decides to watch in 3D, they simply activate the 3D functions using their remote control and put on a pair of 3D glasses. The system then delivers a 3D image from any viewing position in the living room with the same clarity and quality as the latest 3D digital cinemas.

Source: Hemscott

Digital Technology at a Cinema Near You

Cinema of the future is coming to a theatre near you: the introduction of digital technologies is making the European film industry more creative and competitive at every stage of production and distribution. Stereo cinema, big-screen live events and holographic television are revolutionising the media. Watch from scene to screen in this week’s Futuris.

Source: EuroNews

Stereo Converting 3-D Masses

Stereo Pictures is expanding its business as it begins to launch its process of converting movies to stereoscopic 3-D. According to Stereo co-president Phillip Rhee, the goal is to help studios convert libraries of titles to provide the content needed to move the 3-D business to theaters, homes and mobile devices. It also has its sights on new titles and the games market. The company is beginning technology deployment and recently brought in new partners: former Warner Bros. execs James R. Miller and Dan Romanelli.

Stereo's proprietary technology, developed at its R&D site in Korea and designed to convert any digital format to a 3-D format, will be deployed in service bureaus in Los Angeles and Korea. Once set up, the company estimates that it could convert a title in as little as a month and a half. Filmmakers would participate in the conversion. "The marriage of technology and art is critical," Rhee said.

Miller had served as president of Warner Bros. worldwide theatrical business operations, and Romanelli was president of Warners worldwide consumer products. There are slightly more than 1,000 3-D-ready digital cinema screens now installed in the U.S., and early 3-D-ready TV sets are starting to enter the market.

By Carolyn Giardina, The Hollywood Reporter

Panasonic AVC-INTRA Codec Supported by Avid

Panasonic Broadcast announced today that new editing solutions from Avid Technology, Inc. offer import and native editing support for its AVC-Intra codec. The Avid products supporting AVC-Intra include Avid Media Composer software Version 3.0; Avid Media Composer Mojo DX 3.0; Avid Media Composer Nitris DX v 3.0; Symphony Nitris DX v 3.0; Avid NewsCutter software v 7.0; NewsCutter Mojo DX v 7.0; and NewsCutter Nitris DX v 7.0.

AVC-Intra, the industry’s most advanced compression technology, provides higher-quality 10-bit intra-frame encoding utilizing the Hi-10 and Hi-422 Intra profiles of H.264 in two modes: AVC-Intra 100 for full-raster mastering video quality and AVC-Intra 50 for DVCPRO HD-comparable quality at half the bit rate.
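To put the two bit rates in perspective, here is a back-of-envelope record-time calculation for a P2 card. The 32 GB capacity and the neglect of audio and overhead are assumptions of this sketch:

```python
def record_minutes(card_gb, video_mbps):
    """Approximate record time on a solid-state card, ignoring audio
    tracks and filesystem overhead (both assumptions of this sketch)."""
    card_bits = card_gb * 8e9            # decimal GB -> bits
    return card_bits / (video_mbps * 1e6) / 60

print(f"AVC-Intra 100 on a 32 GB P2 card: {record_minutes(32, 100):.0f} min")
print(f"AVC-Intra 50  on a 32 GB P2 card: {record_minutes(32, 50):.0f} min")
```

Halving the bit rate doubles the record time, which is the practical trade-off between the two modes.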

Panasonic’s more advanced, solid-state 2/3” P2 HD camcorders — the AJ-HPX3000 and AJ-HPX2000 — offer AVC-Intra capability, as does the versatile AJ-HPM110 P2 Mobile Recorder/Player. Avid’s latest editing solutions enable AVC-I to be mixed and matched in real-time with other codecs such as Panasonic DVCPRO HD and Avid DNxHD (as long as the same edit rates are used), enabling high performance workflows in multi-format environments.

Source: StudioDaily

Movie Labels to Launch New “Open Market” Play Anywhere Scheme as Last Ditch Effort to Save DRM

Most of the big movie studios and many online movie retailers are preparing to launch a new initiative tentatively called "Open Market", first proposed last year by Sony Pictures, we’ve learned. All of the major studios besides those associated with Walt Disney are already on board and will be part of the announcements made next month. Sony Pictures CTO Mitch Singer’s presentation to industry participants supporting Open Market is available here.

Open Market is a set of policy decisions and a software and services framework that will allow interoperability of various formats and DRM schemes that are currently splintering the market. That splintering locks users into a single store and format, and is putting a stranglehold on widespread adoption of movie sales online. Multiple sources have indicated that the studios are putting their weight behind the initiative to avoid the fate of the music industry and as a last ditch effort to stop or slow non-DRM movie sales.

The industry has been flat out unable to agree on DRM interoperability (the Coral Consortium was the primary hope in this area and has largely stalled). Instead, Sony Pictures proposed "Open Market", which will allow play anywhere functionality more from a policy perspective and less from any technical fix to make DRM schemes interoperable.

A key part of Open Market will be a neutral third party to manage device registrations and movie purchases/rentals to ensure interoperability. This “domain” provider will manage services that let users register devices (PCs, televisions, mobile devices, etc.). Any movie purchased from any service provider can then be watched on a registered device.
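The domain mechanism can be pictured with a toy model. The class and method names below are ours for illustration, not anything from the actual Open Market proposal:

```python
# Toy model of the neutral "domain" provider: purchases are tied to a
# user's domain, and any registered device in that domain may play them.
# All names here are hypothetical illustrations.
class DomainProvider:
    def __init__(self):
        self.devices = {}    # user -> set of registered device ids
        self.purchases = {}  # user -> set of purchased title ids

    def register_device(self, user, device_id):
        self.devices.setdefault(user, set()).add(device_id)

    def record_purchase(self, user, title_id, retailer):
        # Any participating retailer can record a purchase into the domain.
        self.purchases.setdefault(user, set()).add(title_id)

    def can_play(self, user, device_id, title_id):
        return (device_id in self.devices.get(user, set())
                and title_id in self.purchases.get(user, set()))

dp = DomainProvider()
dp.register_device("alice", "living-room-tv")
dp.record_purchase("alice", "movie-123", retailer="AnyStore")
```

The point of the model: playability depends on the domain, not on which store sold the title, which is the policy-level interoperability the article describes.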

Supposedly a whole slew of companies are supporting the effort. Fox, Paramount, Sony, Universal and Time Warner are on board. Retailers like Amazon, Target, WalMart, Comcast, MovieLink and CinemaNow are also said to be participating. Notably absent are Apple and the various Walt Disney studios (Pixar, Touchstone, Miramax, etc.), which are strongly backing the iTunes/FairPlay scheme.

By Michael Arrington, TechCrunch

Maximum Throughput Builds a Better Sledgehammer

Europeans will get their first look at the latest version of Maximum Throughput’s Sledgehammer server, with its built-in MAXmedia browser-based media management application at IBC in Amsterdam. It adds new features that address performance, collaborative workflows, media assembly, media search and retrieval, metadata compatibility and device integration, and format conversion and encoding.

The new V3.6 Sledgehammer delivers an aggregate, sustained network throughput of more than 700 MB/s. It can allocate guaranteed network bandwidth for dedicated clients such as editing and effects workstations that require sustained real-time streams. Additionally, the Sledgehammer HD!O model adds support for real-time capture and playback of 1080/50p, 1080/59.94p, and 1080/60p at 4:4:4 12-bit. Closed Captioning, keycode and production metadata are supported.
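For scale, a single uncompressed 1080/60p 4:4:4 12-bit stream runs about 560 MB/s; this is our arithmetic, not a Maximum Throughput figure, but it shows why a >700 MB/s aggregate matters for real-time capture:

```python
# Data rate of one uncompressed 1080/60p 4:4:4 12-bit video stream.
width, height, fps = 1920, 1080, 60
components, bits_per_component = 3, 12   # 4:4:4 sampling, 12-bit

bytes_per_second = width * height * fps * components * bits_per_component / 8
print(f"{bytes_per_second / 1e6:.0f} MB/s per stream")
```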

Sledgehammer is a Network Attached Storage (NAS) file server with built-in media management and preparation tools. With its real-time video I/O and control capabilities, combined with file-sharing and media processing support, the product accommodates both tape-based and file-based workflows.

The new software release of Sledgehammer’s MAXmedia application adds MXF encoding support for format conversion while allowing dynamic preview of video conversion. It also adds capabilities to the timeline, including the ability to drag and drop clips to and from any location on the timeline, to display and magnify audio track waveforms, and to trim and slip segments within the timeline, as well as several keyboard shortcuts and modifiers, enabling faster media sequencing and editing.

Sledgehammer file servers are available in either the NAS configuration or in the HD!O model. NAS file servers can be upgraded to Sledgehammer HD!O (SD, HD, and 2K/4K film resolutions) for real-time capture of uncompressed SD or HD video at 8-bit,10-bit or 12-bit color depth.

By Michael Grotticelli, StudioDaily

Inlet H.264 Encoding Distributes One Live Feed to Many Web Users

Distributing high-definition video over the Internet was once looked at as a futile proposition, because most people didn’t have large computer screens, and the bandwidth necessary to send these data-intensive files was virtually non-existent (at any price). Well, those days are history. Now, content creators can offer online viewers a “lean-back” experience over the Web, thanks to advancements in compression algorithms and software acceleration.

Currently, nice-looking HD video can be delivered at a minimum of 1.25 Mbps at 960x540 (slightly smaller than native 720p) resolution. It is, after all, web-based HD video, but the better the available data rate, the better the resolution. There is no set standard defining “HD video” on the web.
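One way to compare such streams in the absence of a standard is bits per pixel; the 24 fps frame rate below is our assumption, since web frame rates vary:

```python
# Bits per pixel at the quoted minimum web-HD rate.
bitrate_bps = 1.25e6
width, height, fps = 960, 540, 24   # frame rate assumed for this sketch

bits_per_pixel = bitrate_bps / (width * height * fps)
print(f"{bits_per_pixel:.2f} bits per pixel")
```

Raising the data rate at fixed resolution raises this figure directly, which is the "better the available data rate, the better" relationship in practice.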

To meet the imminent demand—and reach the thousands of people using applications built on Adobe Flash technology—one company now offers a single-box encoder capable of streaming live, HD H.264 (MPEG-4) video to users of Adobe Flash Media Streaming Server software. The Adobe software allows content distributors to stream protected, high-quality video on the web and to mobile devices. Using the Flash media format provides access to these HD files for millions. (Adobe reports that 85 percent of its users have upgraded to the new H.264 version of Flash since it was introduced last year.)

Inlet Technologies, based in Raleigh, NC, has added support for the format in its Spinnaker line, according to director of product management Andy Beach. The model 7000 is a real-time streaming encoder that works with Adobe Flash Media Server 3 to deliver streaming content to Adobe Flash Player users. Designed for streaming live feeds, the product is a result of Inlet’s work with Adobe as a member of the Flash Media Solution Providers Program. Major League Baseball uses the SD version of the Spinnaker encoder for its online games feeds.

Inlet Spinnaker 7000

Accommodating a wide variety of users, Spinnaker takes in a single live feed and outputs Flash H.264, VP6, VC-1 and Silverlight streams simultaneously, allowing content providers to reach virtually any user on any device. The encoder also supports multiple bit rates, allowing users to simultaneously deliver content streams to up to four different devices, such as a set-top box, computer and mobile device.

Users can also automate streaming for regularly scheduled events or remotely schedule streams in advance for special live events. In addition, the box features signal monitoring capabilities, providing automatic alerts to problems with picture, audio and more. Alerts are displayed on the Spinnaker device and can be emailed to remote users.

At $18,000 for HD streaming capability, the technology does not come cheap, but it looks to be popular with sports broadcasters, corporate video departments, government agencies and small-to-medium-sized media production companies.

By Michael Grotticelli, StudioDaily

CineForm Officially Supports Red One

CineForm has developed a command-line utility called R2CF.exe that converts R3D files to CineForm files. Currently it is Windows (or Boot Camp) only, but it will be made available for Mac OS in the near future. Your CineForm files become your new master files for your online workflow, and you can archive your R3D files as the digital-negative source. R2CF allows you to choose either AVI or MOV wrappers depending on whether your intended post workflow is on Windows or Mac. Although R2CF does not yet run on the Mac, the files it creates are fully Mac compatible. BTW, R2CF will be integrated into HD Link (with its GUI) in the near future, eliminating the requirement to use a command line.

CINEFORM ENCODER: To convert into CineForm files you need a CineForm encoder product installed (Neo HD/4K or Prospect HD/4K) in addition to R2CF.

CONVERT: R2CF will convert single R3D files to CineForm files. R2CF is included in the CineForm --> Tools directory under Start Menu for Neo HD/4K and Prospect HD/4K (Build 179 or later). In the Tools Directory we also have batch scripts (.vbs) that perform batch conversion on folders full of R3D files. Parameters within batch scripts are easily edited by right-clicking on the .vbs file (Windows Explorer) and selecting Edit.
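For those who prefer Python to the shipped .vbs scripts, a folder of R3D files can be batch-converted with a small wrapper. Note that the R2CF install path and the bare input/output argument order shown here are assumptions for this sketch; consult R2CF's actual usage text for the real flags.

```python
# Hypothetical batch wrapper around R2CF.exe. The install path and the
# "input output" argument order are assumptions, not documented behavior.
import subprocess
from pathlib import Path

R2CF = r"C:\Program Files\CineForm\Tools\R2CF.exe"  # assumed location

def build_commands(src_dir, dst_dir, wrapper="avi"):
    """One command per .R3D file; wrapper is 'avi' (Windows) or 'mov' (Mac)."""
    cmds = []
    for r3d in sorted(Path(src_dir).glob("*.R3D")):
        out = Path(dst_dir) / f"{r3d.stem}.{wrapper}"
        cmds.append([R2CF, str(r3d), str(out)])
    return cmds

def convert_all(src_dir, dst_dir):
    for cmd in build_commands(src_dir, dst_dir):
        subprocess.run(cmd, check=True)  # requires R2CF.exe to be installed
```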

EXPRESS FILES: With the introduction of R2CF we are also introducing a new workflow element called "CineForm Express" files. Express files are color-perfect "mini" versions of full-res CineForm files. An Express file is typically a 1/4 (horizontal and vertical) resolution version of the source file; for 4K (2:1) R3D source, the Express file is 1024x512, with a data rate of about 3 MB/sec. It also shares the same file GUID (Globally Unique Identifier, which we use for Active Metadata) with the full-res CineForm Master file, so it supports all NLE color correction, transitions, effects, and CineForm Active Metadata identically to the CineForm Master file. When used in a 1080p Premiere Pro project, Express files are properly scaled for HD-SDI output using an AJA Xena card.

The idea is that you can create CineForm Express files much faster than real time on a modern dual quad-core machine and then immediately begin real-time, multi-stream editing, working identically to the way you would with a full-res CineForm Master file, including HD-SDI support. But the files are much smaller and have extremely high performance, enabling easy RT editing even on a modest laptop.

Whenever it is convenient in your workflow you can unlink the Express files and re-link the full-res Master files to complete your online edit at your final mastering resolution while retaining effects, transitions, color correction, etc. - no proxy files, EDLs, or conforming required.

R3D METADATA: All R3D metadata is copied and inserted into CineForm (both Express and Master) files. White Balance is formatted as Active Metadata. Timecode data is also retained. All CineForm file formats (Express / RAW / 444 / 422) support Active Metadata, although you'll need Prospect 4K (Windows) or Neo 4K (Mac - temporarily called the Mac codec) to manipulate Active Metadata.

The Red One workflow page offers more details plus instructions about how to use the tools. Feel free to give it a try. If you are not an existing CineForm customer you can download our Trial products that offer full functionality for 15 days.

Source: RedUser Forum

The Latest Gear Behind 3D Movie Making

DLP recently teamed up with former Lucasfilm effects studios Kerner Optical and Tippett Studio to work on a stereoscopic 3D trailer for their cinema projectors. Pushing this style of filmmaking forward was something they were all excited about, and they took me around Kerner Optical's facility and showed me the latest camera gear they used to make the 3D trailer.

Seeing the camera setup in person is pretty awesome. It consists of two cameras -- one facing straight forward in a horizontal orientation, and one facing straight down, in a vertical position. Between them sits a mirror, angled at roughly 45 degrees, that acts as a beam splitter, directing the image to the vertically facing camera and helping to create the 3D effect. While the vertical camera remains stationary, the horizontal camera slides from left to right. In doing so, the intensity of the 3D effect varies according to position as the pictures from the two cameras phase in and out. Once the camera has done its job, it's up to the viewing apparatus to carry out the rest of the magic.
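Sliding the horizontal camera changes the interaxial separation, and a textbook parallax formula (our approximation, not Kerner's actual math) shows why the strength of the 3D effect scales with it:

```python
# Screen parallax for a parallel stereo rig with convergence simulated by a
# horizontal image shift. This is the standard textbook approximation, used
# here only to show that depth effect scales with interaxial distance.
def sensor_parallax_mm(interaxial_mm, focal_mm, convergence_m, object_m):
    """Horizontal parallax on the sensor, in mm (positive = behind screen)."""
    return interaxial_mm * focal_mm * (
        1.0 / (convergence_m * 1000) - 1.0 / (object_m * 1000)
    )

p_narrow = sensor_parallax_mm(65, 35, 3, 10)    # roughly eye-spacing interaxial
p_wide   = sensor_parallax_mm(130, 35, 3, 10)   # camera slid twice as far
```

Doubling the interaxial doubles the parallax for the same scene, which is the varying "intensity of the 3D effect" the article describes.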

Kerner Optical uses special LCD monitors with the ability to display 3D images with the help of polarized glasses. Many rear-projection DLP televisions actually do the same thing, but a lack of content support has kept the technology obscured from most owners.

By Adrian Covert, Gizmodo

IRIDAS Brings RealTime RAW 2.0

IRIDAS, the world leader in RAW playback technologies, is bringing RealTime RAW 2.0 to the crowds at the San Jose Convention Center for NVISION 08 this week. On August 26 and 27, visitors to the PNY booth (200/305) can see live playback of RAW content with pristine image quality, using PNY NVIDIA graphics to provide de-Bayering in real time. The technology means that filmmakers now have immediate access to their footage straight from the camera.

IRIDAS is the only developer offering universal support for all available digital cinema RAW camera formats. In addition to RAW playback, creative looks can be saved as metadata and used for graded playback right on the set. RealTime RAW 2.0 is available with all 2008 versions of FrameCycler and SpeedGrade.

Source: IRIDAS

Avid Advances in the Art of 3-D

Avid Technology has jumped into the stereoscopic 3-D arena. The company unveiled version 10 of its Avid DS finishing system, its first product to incorporate tools for stereoscopic filmmaking. This summer, a number of postproduction technologies that support 3-D were introduced by various manufacturers. Da Vinci Systems recently revealed that it will preview its R3D color grading system designed with 3-D support next month at the International Broadcasting Convention.

At the recent Siggraph confab, Autodesk Media & Entertainment introduced two new versions of its products that will have stereo tools: Maya 2009, a computer animation software; and Toxik 2009, a compositing system. Meanwhile some 3-D capable post tools already are on the market, including those for Quantel and Assimilate.

Avid DS supports such key functions as conform and color correction. Version 10 includes a new stereoscopic container as well as a redesigned color management tool interface and support for Avid's infrastructure products.

By Carolyn Giardina, The Hollywood Reporter

Digital Update

Fall 2008 marks three years since the rebirth of cinematic 3D and the initial installations of approximately 100 screens for Disney’s Chicken Little. There are now approximately 1,800 3D digital screens worldwide, with equipment from three primary vendors, Real D, Dolby and XpanD, and that number is expected to increase dramatically in the next year. As digital 3D has matured, we have also seen new names pop up, including MasterImage, ColorCode and Sensio, each with plans to extend their brand and products into the cinema exhibition market.

While the choices in 3D equipment from vendors continue to grow, it is the studios, their titles, and the great box-office results that are driving 3D conversions. Both Disney and DreamWorks Animation have made substantial commitments to supply a full slate of 3D content, with approximately 30 titles announced so far. Disney has announced two major Christmas titles will be available in 3D: Bolt and the traditional re-release of The Nightmare Before Christmas. In 2009, Disney will be releasing Jonas Brothers, G-Force and others including the re-release of the Toy Story trilogy. DreamWorks Animation is expecting a significant push for 3D with their May 2009 release of Monsters vs. Aliens, and likely every animated title will be in 3D thereafter. Warner Bros., through their New Line division, recently made history by releasing the first new live-action 3D title, Journey to the Center of the Earth, and reports a phenomenal 3.7 times the box office over 2D screens. And in 2009, 20th Century Fox is releasing Ice Age 3: Dawn of the Dinosaurs in July and, of course, the highly anticipated 3D release of James Cameron’s Avatar in December.

While the production pipeline is gearing up for an increase in 3D titles, the question remains, particularly with distributors, whether there will be enough 3D screens to accommodate the expected 2009 releases. Fortunately, based on the recently announced commitments from the leading 3D technology providers, the answer is a clear yes.

Real D
Real D, with approximately 1,400 screens installed, continues to lead with an impressive number of commitments from U.S. and international exhibitors. Just recently, Regal Entertainment committed to 1,500 Real D screens. Cinemark, also a partner in the Digital Cinema Implementation Partners (DCIP) rollout, followed with a commitment to add an additional 1,500 screens over the next three years.

In Canada, Cineplex Entertainment has agreed to increase their total from 41 to 175 Real D systems, and Mexico’s Cinepolis has committed to adding 500 Real D 3D systems by the end of 2010. These additions, along with Odeon’s plans for up to 500 Real D systems in Europe, bring Real D’s total commitments to over 5,500 systems in the 2009 and 2010 timeframe.

Real D’s engineering team has been busy finishing off their most recent innovation, the Real D XL “light-doubler”, an optical assembly that fits in front of the standard DLP Cinema projector, replacing their previous “Z-screen” shutter assembly. The XL filter recovers most of the light lost in the polarization process, allowing Real D 3D to be used with a single projector on screens as wide as 60 feet. Real D XL addresses a need in the market as more and more 3D content becomes available and exhibitors want to play these attractions in 3D on their larger screens. Michael Lewis, Real D 3D chairman and CEO, says, "Real D is committed to 3D cinema and to innovation. 3D is all we do—and this expertise allows us to continually upgrade our cinema technology, providing the absolute best 3D experience available. Real D XL is the latest example of that."

Dolby Digital 3D
Dolby Digital 3D also continues to make steady progress, particularly in the international market, with around 200 systems installed worldwide, and commitments for considerably more in the coming months. "Momentum for the Dolby 3D system is growing, as exhibitors around the world are seeing the immediate benefits of our digital 3D solution," states Peter Seagger, VP of international sales. "Not only does our high-quality system create an incredible moviegoing experience for customers, but it's also easy for exhibitors to integrate in their theatres.” Unlike the polarized systems (Real D and MasterImage), Dolby’s process uses a white screen but more expensive passive glasses that can be used around 400 times, decreasing their per-screening cost.

Dolby has recently initiated a licensing program to allow the Dolby 3D color processing necessary for correct playback to be incorporated into servers from other companies, thereby relieving the requirement that Dolby 3D playback be built on a Dolby Digital server. At the recent Cinema Expo in Amsterdam, XDC demonstrated that they can now play Dolby 3D using their new CineStore Solo G3 server. It is expected that other server companies will follow XDC by offering Dolby 3D-playable capability with their equipment.

XpanD
XpanD is a relatively new name in the North American cinema market, although fairly well known in Europe, where they have approximately 200 3D installations in commercial theatres. In June 2008, XpanD announced its first U.S. commitment with the sale of 12 systems to Marcus Theatres. “Marcus Theatres is pleased to offer our guests a unique 3D experience at these 12 locations utilizing XpanD’s active glasses,” said Marcus president Bruce J. Olson. “This innovative technology provides another way for us to attract more moviegoers to our theatres. We look forward to working together to create the ultimate movie experience.”

XpanD uses active LCD shutter glasses and therefore does not require a silver screen to maintain polarization. In the spring of 2008, XpanD acquired the assets of NuVision Technologies of Beaverton, Oregon, the world’s leading manufacturer of active 3D glasses, and opened a North American sales and support office in Los Angeles.

Although 3D systems using active glasses are popular with parks and special venues, they are just catching on with commercial exhibitors, primarily because the glasses are becoming less expensive, lighter, and generally more user-friendly. Although XpanD’s current glasses are priced comparably to Dolby’s passive glasses, they also have a finite lifetime of around 300 to 400 movies due to their internal sealed battery. At Cinema Expo, XpanD showed concept drawings of their next-generation active glasses that will have replaceable batteries and an RFID security chip. Deliveries are planned for this fall.

Exhibitors like the additional flexibility of being able to move XpanD 3D between auditoriums by just equipping screens with the IR emitters and moving the glasses, rather than having dedicated 3D auditoriums. When it comes to glasses, Real D has an advantage in that their glasses are low-cost, single-use giveaways, while both Dolby and XpanD use more expensive eyewear that must be collected and reused between shows. Typically, the glasses are washed between shows in industrial dishwashers; however, some European exhibitors have found their customers prefer cleaning them with disposable wipes when provided. This not only saves the time and cost of processing the glasses between shows, but allows exhibitors to recover the costs of the wipes by selling ads on the packages.

MasterImage, a Korean 3D technology provider, has also introduced their 3D systems in approximately 20 theatres in the Asian market and is testing in a handful of North American theatres. The MasterImage system is somewhat similar to Real D’s alternating circular polarized approach in that it uses a silver screen along with low-cost glasses. Instead of an electronic shutter, MasterImage uses a mechanical shutter wheel mounted near the projection lens. Although similar in principle, the glasses are not compatible with Real D’s. MasterImage claims their system is lower in cost with no ongoing licensing fees and no restriction on moving systems between auditoriums.

All of the current 3D systems are designed to be used with a single projector, but can also be used with two side-by-side projectors when needed. The dual-projector systems provide more light for larger screens and arguably a better image, because triple-flashing the image through one projector can be eliminated. But dual projectors are more difficult to keep in the tight alignment necessary for accurate 3D images, and therefore some studios have discouraged their use except on a case-by-case basis.

Another company making 3D news is Sensio, from Montreal. Unlike the others, Sensio is not a 3D format but a transport system that encodes the two separate eye channels into one data channel so it can be efficiently sent through existing satellite links. Sensio uses what is said to be visually lossless spatial compression between the two eye channels to reduce the bandwidth to that of a single channel. In the cinema, Sensio equipment decodes the transmitted data back into the left-eye and right-eye channels, where they can be fed into the projector equipped for a specific 3D system, such as Real D, Dolby or XpanD. Therefore, Sensio 3D is a delivery technique that works upstream from the existing 3D formats.
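The idea of fitting two eye channels into one channel's bandwidth can be pictured with a toy sketch. Sensio's actual "visually lossless" spatial compression is proprietary, so everything below is an illustrative assumption: it uses crude column decimation and side-by-side packing as a stand-in.

```python
import numpy as np

def pack_side_by_side(left, right):
    """Pack two eye views into one frame by halving each view's width.

    Generic frame-compatible packing, shown only to illustrate squeezing
    two eye channels into one channel's bandwidth; a real encoder would
    filter before decimating and compress far more intelligently.
    """
    half_left = left[:, ::2]    # keep every other column (crude decimation)
    half_right = right[:, ::2]
    return np.hstack([half_left, half_right])  # same pixel count as one 2D frame

def unpack_side_by_side(frame):
    """Split a packed frame back into the half-width left/right views."""
    w = frame.shape[1] // 2
    return frame[:, :w], frame[:, w:]
```

After transport, a decoder would upsample each half-width view back to full width before feeding the projector's 3D system.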

Recently, International Datacasting Corporation (IDC), partnered with AccessIT, announced the initial deployment of the “CineLive” solution utilizing Sensio decoder technology to 50 theatres across the United States, with plans to increase deployment to 150 theatres by the end of 2008. “With this first implementation, we have secured our lead over the competition and increased our visibility, as we have an agreement that the Sensio brand will be shown during every presentation,” stated Nicholas Routhier, Sensio president and CEO.

ColorCode 3D
ColorCode 3D is another new name in the cinema market. Based in Denmark, ColorCode has been showing their 3D process to Hollywood studios, although they have yet to demonstrate feature content on a large screen. ColorCode 3D is often described as “super-anaglyph” in that it uses low-cost color filter glasses. The twist is that the left- and right-eye channels are combined into a single channel using a proprietary ColorCode 3D encoder during production and the encoded image looks quasi-compatible in 2D. In other words, without the glasses, the image is said to look “almost” as good as the normal 2D image.

ColorCode points out that their process requires no changes in the playback system and that it works with any standard digital projector and screen, with film prints, or consumer formats such as DVD, Blu-Ray, and even with printed materials. In theory, the one-sheets, advertising and consumer release could also be viewed in 3D using the same glasses. Of course, it remains to be seen whether viewers will accept the compromises in the 2D and 3D images in order to obtain this cross-compatibility. For the cinema, it is unlikely that critical filmmakers will accept anything less than perfect images, but for the consumer markets, ColorCode 3D may offer some interesting possibilities.

With the number of titles and equipment choices growing, adding 3D should be a central part of any exhibitor’s digital deployment plans. The choice of equipment is an individual decision, as each 3D format offers its own unique set of advantages that may better match a particular exhibitor’s or screen’s requirements.

By Bill Mead, Film Journal International

Looking Sharp

In June 2004, Sony took the bold step of introducing its first 4K projector, the SRX-R110, to the cinema exhibition industry. Unlike the majority of digital projectors being installed, which used Texas Instruments’ 2K DLP Cinema technology, the Sony SRX-R110 used Sony’s own proprietary imaging device, known as SXRD, offering twice the horizontal and vertical pixel count of the 2K projectors. This in theory results in sharper pictures, which Sony and many others in the industry believe are ultimately necessary for movie theatres to differentiate themselves from consumer-level HD systems that now achieve near-2K resolution in the home.

While the SRX-R110 was possibly ahead of its time when demonstrated at industry events as an evolving prototype, it served the function of introducing 4K images and techniques to the industry and defining what features would be needed in Sony’s second-generation 4K cinema projectors. While most in the industry agree that the 4K projector can produce stunning images, its primary benefits are only relevant to exhibitors if there is a steady supply of 4K titles. The DCI specification elegantly accommodates 4K as a layer above 2K to allow single-inventory distribution, but creating 4K masters remains an industry challenge. Most Hollywood production processes are based on a 2K workflow and 4K adds a fairly significant time and cost bottleneck to the process.

“Sony is committed to streamlining the 4K workflow pipeline and is devoting considerable resources to developing the industry tools necessary to make quicker 4K production a reality in the next few years”, says Gary Johns, Sony’s VP in charge of its U.S. digital-cinema division. “Regarding titles and theatres, it’s the classic ‘chicken-and-egg’ dilemma. As we get more 4K theatres, we’ll see more 4K titles.” Sony demonstrated their ability to generate 4K content with the recently released Sony Pictures Hancock and Spider-Man 3 in 4K.

In the past four years, Sony has continued to refine its 4K cinema product line with new products, the latest being the SRX-R220 series, first shown at ShowEast in October 2006. While the earlier SRX-R110 was designed for a wide variety of uses, the newer SRX-R220 series was intended specifically for movie theatres and incorporated the security and DCI media block required to show Hollywood content.

In March 2007, Sony announced it had secured an arrangement with Muvico Theaters to equip its new 18-screen multiplex in Rosemont, Illinois, with SRX-R220 projectors. As this was the supplier’s first multiplex-wide commitment, Sony worked closely with Muvico to ensure the Rosemont installations would be an ideal showcase for 4K and Sony’s other cinema-related products and services. The Muvico Rosemont installations were completed in the summer of 2007.

In early 2008, to facilitate the ongoing digital-cinema deployments, Sony created a new group based in Los Angeles, headed by Sony veteran Mike Fidler. The new Digital Cinema Services and Solutions Division (DCSS) plays an industry role similar to other third-party system integrators, in that they are providing initial system integration, after-sales service and support, as well as offering exhibitors attractive financing packages that will include studio conversion incentives. “The Sony DCSS group not only provides support for Sony’s 4K products but brings exhibitors solutions for pre-show advertising, lobby display, and offers a full-spectrum of Sony solutions,” says Fidler. “The Sony DCSS division also serves as a bridge between the Hollywood studios and Sony’s other worldwide sales divisions in providing conversion incentives for cinemas in the overseas markets.”

Last year, AMC Entertainment announced that they would be equipping 54 screens at four of their new sites with the Sony SRX-R220 projectors, and as of July 2008 AMC has installed 132 systems in North American cinemas. Overall, Sony has approximately 200 4K SXRD projectors installed in commercial cinemas worldwide. AMC, being a partner in the Digital Cinema Implementation Partners (DCIP) initiative, is part of the rollout being planned for the DCIP partner companies: AMC, Regal and Cinemark. It is quite likely that as DCIP’s plans mature, AMC’s initial Sony 4K installations will be incorporated into the ongoing DCIP deployment.

Sony 3D
While Sony has been counting on 4K to be a significant differentiator, the market has turned to 3D, which produces instant and obvious results for the audience. The current 3D formats have been designed around the current DLP Cinema technology, where a single projector can fairly easily accommodate the increased frame rate 3D requires. With the SXRD devices, single-projector 3D using the conventional triple-flash solution is considerably more difficult to implement, so Sony’s engineering team has had to devise some unique and creative solutions.

The DCI specification supports 2D in 4K at 24 fps and in 2K at 48 fps. Current 3D systems implement 3D within the DCI-specified package by interleaving the left-eye and right-eye channels into a single 2K 48 fps stream. Therefore, all current DCI 3D implementations are limited to 2K in order to meet the distribution package specification.
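The interleaving described above can be sketched in a few lines. The frame objects here are simple placeholders, not real DCP essence; actual packaging involves compression, encryption and much more.

```python
def interleave_stereo(left_frames, right_frames):
    """Interleave two 24 fps eye streams into one 48 fps sequence
    (L, R, L, R, ...), as in the 2K 48 fps 3D packaging described above.

    Placeholder frames only; this just shows why the combined stream
    carries twice the frame count of a 2D 24 fps presentation.
    """
    assert len(left_frames) == len(right_frames)
    out = []
    for left, right in zip(left_frames, right_frames):
        out.extend([left, right])
    return out
```

For example, `interleave_stereo(['L0', 'L1'], ['R0', 'R1'])` yields `['L0', 'R0', 'L1', 'R1']`, a stream the projector then flashes fast enough that each eye perceives continuous motion.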

Sony views the DCI-imposed 2K limitation on 3D as an opportunity. Since they have a 4K-wide imaging device, they have cleverly devised a way to use its full height to display both the left-eye and right-eye images full-time, in parallel, top-and-bottom on the SXRD display chips. In conjunction, they have developed a special dual-lens assembly for the SRX-R220 projector that repositions the two images into one 3D image at the screen.
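Sony's top-and-bottom use of the tall imager can be pictured schematically. The panel dimensions and placement below are illustrative assumptions only; the real scaling, guard bands and the dual-lens optics that superimpose the two halves on screen are Sony-proprietary.

```python
import numpy as np

def pack_top_bottom(left, right, panel_h=2160, panel_w=4096):
    """Place left- and right-eye images top-and-bottom on one tall imaging
    device, with both eyes displayed full-time in parallel.

    Schematic sketch: the 4K panel geometry and the positioning of the two
    sub-images are assumptions, not Sony's actual layout.
    """
    assert left.shape == right.shape
    h, w = left.shape
    assert 2 * h <= panel_h and w <= panel_w
    canvas = np.zeros((panel_h, panel_w), dtype=left.dtype)
    canvas[:h, :w] = left              # top half: left eye
    canvas[panel_h - h:, :w] = right   # bottom half: right eye
    return canvas
```

Because both eye images are on the chip simultaneously, no triple-flashing is needed; the dual-lens assembly overlays the two halves into one image at the screen.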

Sony has been quietly working with both Real D and Dolby in developing the filters for their 3D adapter, with the objective of offering solutions that accommodate both Real D’s circular polarized process and the Dolby-Infitec tri-filter process. Therefore, Sony’s 3D solution will be compatible with the existing 3D formats and glasses.

“With the addition of the 3D adaptor to our lineup, Sony can now offer the best of all worlds to exhibition: stunning 4K imagery from 4K movies; incredible 3D, with no triple-flash artifacts; and upscaled 2K movies,” Gary Johns asserts.

At Cinema Expo in Amsterdam, Sony was privately demonstrating their 3D adapter for the SRX-R220 projector to key exhibitors, and the company plans to have a demonstration system installed in Los Angeles in the near future. And, of course, look for the Sony 3D-enabled projector at ShowEast 2008.

By Bill Mead, Film Journal International

The Rollout Rolls On

After months of inching along, the U.S. digital-cinema rollout is finally showing signs of momentum again as we move into the second phase of deployments with what will eventually amount to over a billion dollars of equipment purchased. In July 2008, Digital Cinema Implementation Partners (DCIP), the consortium made up of Regal Entertainment, AMC Theatres and Cinemark and representing about 35% of the U.S. market, struck its first major conversion incentive deal with 20th Century Fox, with the other studios expected to follow.

Based on successfully concluding similar incentive deals with the other studios, DCIP expects to begin deploying digital systems in its partners’ theatres in the fourth quarter of 2008, starting at a rate of around 200 to 300 systems per month for the next three to five years. Representatives of DCIP would not comment on further details except to say that more announcements are expected in the near future.

Last March, the Cinema Buying Group, sanctioned by the National Association of Theatre Owners (known as NATO/CBG), announced they had selected AccessIT as their digital-cinema provider. With over 600 NATO/CBG members and representing over 8,000 screens, the CBG deployment represents approximately 20% of the North American market. Although there has been little outward progress, AccessIT’s representatives have been working behind the scenes to prepare for the NATO/CBG rollout. With their “phase-two” studio deals in place, providing incentives for up to 10,000 North American screens, recent efforts have focused on defining the theatre qualifications and profiling the individual requirements of the NATO/CBG members. With the preparation nearing completion, the first installations are expected to begin later this fall.

For the past year, the digital-cinema rollout in the USA has essentially been stuck between what has often been described as phase one (the early adopters) and phase two (the mainstream exhibitors). Early adopters, exhibitors who saw value in being first and wanted experience operating digital theatres, account for approximately 5,000 systems completed in the U.S. during 2005 and 2006. Many of the larger exhibitors, particularly the DCIP partners, chose to wait until the industry had matured and the initial start-up issues had been resolved.

The most fundamental issue is conversion cost. From the early discussions of the digital conversion in 2001, it became apparent that digital would fundamentally change the business. With distributors reaping the potential savings, but exhibitors having to bear the purchase and maintenance costs of more complex projection systems, the question of “Who will pay?” resonated throughout the industry. In response, the distributors offered a general answer of “We will,” but were short on specifics on just how this would be done.

By the fall of 2005, the third-party digital-cinema providers, primarily AccessIT and Technicolor Digital Cinema (TDC), announced the first deployment plans, which provided studio incentives when a digital copy replaced a 35mm print. These deals are now referred to as virtual print fee (VPF) arrangements. The phase-one VPF-based programs were appropriate for the early adopters, but could not necessarily be extended across the mainstream market. The studios were reluctant to commit to the same level of incentives for the broader market as they did for the initial adopters. While the potential savings from not making 35mm prints are real, during the changeover period studios had to support both film and digital, which resulted in increased costs. The distributors viewed their initial VPF deals as short-term incentives needed to jump-start the conversion process, but not necessarily as an ongoing funding mechanism.

VPF incentives typically require a third-party digital-cinema provider to negotiate the studio deals, finance the equipment, collect the fees and oversee the process. While this solution works well for many exhibitors, it isn’t ideal for everyone. Some exhibitors are able to secure their own financing and have in-house technical teams fully capable of supporting their own deployments. These exhibitors are looking for an ongoing solution that retains their independence and helps them offset the potentially higher costs of digital projection. In the past, with high-cost equipment and little track record to base decisions on, most exhibitors could not begin to estimate the overall costs (or savings) of moving to digital projection. As the industry matures, however, many of the blanks can now be filled in, allowing exhibitors to more accurately evaluate the factors leading to rational and justified conversion decisions.

Today, with the third-generation digital projectors and servers, the costs of a basic digital system are approaching 50% of the equivalent from five years ago, a significant reduction that is putting digital within reach of the mainstream exhibitors. In addition, with years of operating experience, digital is showing savings in overall theatre operations, especially when the entire multiplex—or entire circuit—is converted over. While maintaining digital projection and 35mm side-by-side offers flexibility, the true savings of digital projection begin when the 35mm equipment is no longer needed.

A second and somewhat unexpected factor is the re-emergence of 3D. With the proven popularity of today’s 3D presentations, and the increase in available titles, 3D now has to be considered an essential part of any exhibitor’s digital-cinema deployment plans. While 3D certainly puts more life in the box office, the decision to add 3D equipment causes exhibitors to consider which of the various systems to add, and how many of their digital screens should be equipped with 3D. From a cost standpoint, 3D is a mixed blessing. In the past five years, the cost of a basic digital-cinema playback system has declined, but the cost of adding 3D equipment now reverses some of the savings gained by lower-cost digital projectors.

So far, 3D has proven to be a moneymaker, with box-office results reportedly in the 2X to 4X range over 2D screens. Exhibitors can justify the addition of digital and 3D for those screens where the 3D titles are expected to play well. 3D has allowed exhibitors to localize the purchase decision to a screen-by-screen basis, where the expense of 3D digital equipment can often be justified in the short term by the box-office returns. But these spectacular box-office gains may be short-lived. As the industry matures, more 3D screens are equipped, and more tentpole titles become available, it is likely that the audience will come to expect digital 3D as an essential amenity, much like 5.1 digital surround sound is expected today. After the “mainstream” exhibitors are equipped, it may be just too late for many of the smaller exhibitors to see these returns.

As we approach the mainstream rollout, we are also nearing what might be viewed as the “critical-mass” point of digital exhibition, where what was previously considered new and possibly a novelty will become standard and expected. All signs are now pointing to the mainstream conversion beginning late this year or early 2009, so exhibitors who have not developed their individual plans need to carefully consider how they will fit in with the rest of the industry’s conversion time frame.

By Bill Mead, Film Journal International

Philips Demonstrates 3D on Blu-ray

Philips will demonstrate 3D on Blu-ray at IFA 2008 (August 29 – September 3). The demo shows that its 2D-plus-Depth content format can be applied to Blu-ray thereby enabling a great 3D movie viewing experience on a variety of displays.

With the growing number of 3D movies currently on theater release, people now want to enjoy the 3D viewing experience in their homes as well. In order to bring high-quality 3D content to the home, the 2D-plus-Depth format offers a solution that can be applied to Blu-ray. These discs can then be enjoyed on both stereoscopic (special glasses needed) and auto-stereoscopic 3D displays, which do not require glasses.
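The core idea of a 2D-plus-depth format is that a display can synthesize eye views from one image and its per-pixel depth map. The toy renderer below shifts each pixel by a disparity proportional to its depth; it is a hedged illustration of depth-image-based rendering in general, since Philips' actual WOWvx rendering (occlusion filling, multi-view weaving) is far more sophisticated and proprietary.

```python
import numpy as np

def render_view(image, depth, eye_offset):
    """Toy depth-image-based rendering: shift each pixel horizontally by a
    disparity proportional to its depth to synthesize one eye's view.

    `eye_offset` scales depth into pixels of disparity; disoccluded holes
    are simply left unfilled in this sketch.
    """
    h, w = image.shape[:2]
    out = np.zeros_like(image)
    for y in range(h):
        for x in range(w):
            d = int(round(eye_offset * depth[y, x]))  # disparity in pixels
            nx = x + d
            if 0 <= nx < w:
                out[y, nx] = image[y, x]
    return out
```

An auto-stereoscopic display would repeat this for several viewing angles, which is what lets one disc serve glasses-based and glasses-free screens alike.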

Philips 3D Solutions provides end-to-end 3D system solutions ranging from 3D displays for professional use, 3D content creation and conversion tools to technology licensing. Jos Swillens, CEO of Philips 3D Solutions: “This demonstration of 3D on Blu-ray is a clear proof point of the flexibility and sustainability of our 2D-plus-Depth content format. It can bring high quality 3D content to the home on a variety of displays and offers a solution to the need for interoperability in 3D”.

Philips 3D displays give the viewer an exciting, entertaining and engaging 3D experience, making them ideally suited for professional use in digital signage for retail, point-of-sale advertising, gaming applications in casinos, and 3D visualization. Thanks to the WOWvx technology used in Philips 3D displays, no special viewing glasses are needed.

Alongside this 3D on Blu-ray demo, Philips 3D Solutions will be showing three new 3D display products for professional use: the 52”, the 22” and the 8”. These products will be commercially available from Q4 (52”) and Q1 (22” and 8”) onwards. The impressive 132” 3D WOWzone will also be on show at IFA 2008. All 3D demos and products can be seen in the Futurezone of the Philips booth, located in Hall 22A and B of the Messe Berlin, Germany.

Source: BroadcastBuyer

ViewSonic Showcases 120Hz Display Technology

ViewSonic Corp., a worldwide leader in display technologies, has once again demonstrated display leadership with the unveiling of its first 120Hz desktop LCD technology at NVIDIA’s NVISION 08 event in San Jose, Calif. The 22-inch 120Hz prototype delivers rich, colorful, blur-free video performance on traditional gaming, entertainment and graphic applications, while also delivering eye-popping stereoscopic 3D when used with NVIDIA’s GeForce Stereoscopic 3D gaming technology.

ViewSonic VX2265wm 120Hz LCD display

The 22-inch 120Hz display, coupled with a 3ms gray-to-gray response time, provides much better Motion Picture Response Time (MPRT) than the typical “fast-response” displays on the market today, virtually removing the appearance of motion artifacts and ghosting. This makes it the LCD of choice for extreme gaming, entertainment, computer animation, precision graphic work and traditional computer applications. Features such as integrated 2Wx2 stereo speakers and Dual-Link DVI digital input combine to expand entertainment options and make the monitor the must-have display for 3D gaming.

The display offers excellent front-of-screen performance, including Professional Color Certification, 1680x1050 resolution, 300 nits of high brightness and 1000:1 contrast ratio, as well as wide viewing angles for getting the most out of fast action games, downloaded video content and full-length movies. When coupled with NVIDIA’s GeForce Stereoscopic 3D technology, the ViewSonic 120Hz display provides game enthusiasts with realistic depth, intense motion, rich graphics and detailed images that literally leap off the screen.

The first displays with ViewSonic’s 120Hz technology are expected later in the year at select resellers, retailers and etailers. Pricing is not yet available.

Source: ViewSonic

Stereoscopic 3D Gaming is Really Cool

According to Nvidia, one of the next big things for the visual computing industry is stereoscopic 3D gaming. Nvidia CEO Jen-Hsun Huang introduced the concept during his opening keynote speech and, in many ways, it’s very similar to what Intel announced with DreamWorks last week. However, instead of being focused on the movie industry, Nvidia wants to bring this technology to gamers.

Down on the show floor, both ViewSonic and Mitsubishi have been demoing stereoscopic screens in conjunction with Nvidia’s new GeForce Stereoscopic 3D technology. ViewSonic’s entry is a new 22-inch display with a 120Hz refresh rate and a 1,680 x 1,050 native resolution; it should be available worldwide in the next couple of quarters, all being well.

ViewSonic's 120Hz LCD Display

Mitsubishi, on the other hand, had a 72-inch DLP TV using the same technology and it’s already shipping. However, Mitsubishi said that we’re unlikely to see this TV in the UK because “the European market doesn’t like big TVs,” apparently...

The technology relies on 3D shutter glasses designed by Nvidia and, unlike any other attempt at 3D display technology, the glasses don’t use polarised lenses. Instead, they use mini LCD screens that sync with the PC via an infrared sensor that sits down by your keyboard; this ensures that the correct image is sent to each eye and there is no loss of resolution, claims Nvidia.
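The frame-sequential alternation behind the glasses fits in one small function. Which eye takes the even frames is an arbitrary assumption here, and Nvidia's actual IR sync protocol is proprietary; this just illustrates why a 120Hz panel gives each eye an effective 60Hz.

```python
def shutter_state(frame_index):
    """Frame-sequential shutter scheduling at 120 Hz: even frames go to the
    left eye, odd frames to the right, so each eye sees 60 Hz.

    The even/odd eye assignment is an illustrative assumption; the real IR
    emitter protocol that keeps the glasses in sync is Nvidia-proprietary.
    """
    if frame_index % 2 == 0:
        return ('left_open', 'right_closed')
    return ('left_closed', 'right_open')
```

Because each lens is fully open or fully dark, the panel's full native resolution reaches each eye, which is the "no loss of resolution" claim above.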

Nvidia's Liquid Crystal Shutter Glasses

Nvidia's Infrared Emitter

Upon using the glasses, it was clear that the game looked appreciably sharper than what has come before and, more importantly, you can adjust the depth of the effect using a simple wheel on the back of the sensing device. This should go a long way towards alleviating the eyestrain caused by so many other attempts at delivering a truly 3D experience, but I still have my doubts about the technology.

By Tim Smalley, Bit-Tech

Doremi Technologies - GHX-10-3D

The Doremi GHX-10 features HDMI, DVI and SDI connections that allow the conversion of any input to any output format or scan rate. It supports SD, HD and 2K video and employs high-quality 12-bit bicubic interpolation to ensure the highest-quality picture. Advanced features include audio, sync output and genlock, as well as dual-link SDI and 3Gb/s SDI connections for 4:4:4 2K film. Applications include computer DVI-to-HD-SDI conversion, HDMI or DVI resolution conversion, an HD/SD video upconverter/downconverter, and much more. There is support for up to eight channels of AES, HDMI and SDI audio, embedded or de-embedded to separate AES BNC connectors, or delayed to match the video.

The GHX-10-3D supports the TI DLP TV 3D technology now used in TVs from Samsung and Mitsubishi that display 3D in sequential mode at 120Hz. It accepts regular 3D input from two HD-SDI links and delivers the proprietary TI 3D format on DVI, so 3D content played from a DSV-J2 player can be displayed using DLP TV 3D technology.

Source: InterBEE

Object Matrix - MatrixStore

MatrixStore is software that turns low-cost disk storage (Promise VTrak, Xserve RAID, etc.) into a safe place to store your digital assets. It is a secure store in which to archive all your digital creations for as long as you want to keep them, and it puts all your archived assets onto your network, available to search and use instantly.

MatrixStore has been designed for professional video and film production and is ideal for Final Cut Pro and Final Cut Server workflows. It is built on a cluster of Apple servers and storage arrays, and it is a great complement to Xsan, offering long-term protection for the final cuts you need to keep. A SAN is perfect for provisioning high-performance shared workspace for the editing and production process, but its architecture, management and costs are less suited to long-term protection of assets. Xsan is ideal for what you’re working on now; MatrixStore for keeping original footage and final cuts safe.

MatrixStore can also integrate directly into other applications in your workflow through our open application programming interface (API). Or you can use our DropSpot desktop application to search, upload and download assets from MatrixStore allowing you to script and automate processes across Mac, Windows and Linux platforms and applications.

A MatrixStore license costs $1,000 for every terabyte of assets you need to store. You can download a 15TB MatrixStore license with our compliments to help get you started.

MatrixStore is different from other disk-based systems. It can manage a failure of a complete storage array while keeping your assets available for you to search and use. And if hardware fails MatrixStore re-protects all assets affected, putting a copy of them onto another independent piece of hardware in the system, automatically. You can also configure MatrixStore to replicate all your assets to another off-site MatrixStore to further enhance protection.
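The automatic re-protection behavior described above can be sketched as a replica-placement routine: keep each asset on a fixed number of independent nodes, and after a failure re-copy affected assets to surviving nodes. Every name and data structure below is hypothetical, for illustration only; this is not MatrixStore's actual interface.

```python
def reprotect(replicas, failed_node, healthy_nodes, copies=2):
    """Sketch of automatic re-protection after a hardware failure.

    replicas: dict mapping asset_id -> set of node names holding a copy.
    After `failed_node` dies, each under-replicated asset is re-copied
    (here, just re-assigned) to surviving nodes until it again has
    `copies` independent replicas. Hypothetical API, not MatrixStore's.
    """
    for asset, nodes in replicas.items():
        nodes.discard(failed_node)
        for node in healthy_nodes:
            if len(nodes) >= copies:
                break
            if node != failed_node:
                nodes.add(node)   # stand-in for actually copying the data
    return replicas
```

Off-site replication, as the paragraph above notes, extends the same idea across two independent MatrixStore installations.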

The management overhead of your archive becomes increasingly important as it grows, as does the task of keeping it safe for many years. MatrixStore is designed to lower these management overheads. It is simple to set up, easy and fast to add capacity, and it automatically backs up every asset you store and re-protects any assets affected by hardware failures, all designed to lower the management requirement and make protecting hundreds of terabytes for many years as painless as possible.

Content Rules
MatrixStore makes it viable to put everything you’ve ever created onto your network. Metadata stays with your assets right through their lifecycle which means all your archived content is searchable and available instantly. MatrixStore integrates with Final Cut Server. It has an open interface (API) and can also play well with other Media Asset Managers and applications in your workflow.

Spatial View Launches New Wazabee Gaming Solution Bringing 3D Realism to Hardcore Gamers

Spatial View Inc., a leading developer of 3D image processing and display technologies, announced the availability of its Wazabee glasses-free 3D gaming solution. The Wazabee bundle consists of two key components: a 19-inch multi-user display and a 3D gaming driver.

Customers who purchase the bundle can now play and experience their favorite games in 3D without needing glasses. There are currently 20 game and platform titles supported by Wazabee, including Need for Speed, Call of Duty, World of Warcraft and Second Life, with the bundle retailing for 740 Euros (or US$1,149) before tax.

Source: PR Newswire

OpenCube to Showcase its New MXF P2 Workflow Solution at IBC 2008

OpenCube Technologies will be showcasing its comprehensive, high-performance MXF P2 solution at the IBC Convention in Amsterdam, September 12-16, 2008. The new application enables users to rapidly ingest, view, shot-log and consolidate MXF P2 files, simplifying and speeding up the process from start to finish.

Tapeless productions are on their way to becoming common currency throughout the broadcast industry, with more and more companies adopting the MXF format and, more particularly, the MXF P2 format for News Camcorders. OpenCube offers an unusually wide range of MXF solutions permitting a hassle-free and cost-effective transition to MXF workflows.

Among other solutions, OpenCube will present:

P2Soft HD v2.0 provides a seamless gateway to newsrooms and production facilities. The new P2Soft HD release enables users to automate and streamline the ingest process and includes numerous cutting-edge features, notably format conversion for efficient connections with production and editing tools. P2Soft HD v2.0 is an easy-to-use and effective solution, with powerful tools for shot-logging and selecting useful media, as well as for annotating the content with metadata as soon as it is acquired.

OpenCubeSD v2.1 gateways are MXF native and, more specifically, MXF P2 OpAtom-compatible, permitting a tape back record for temporary storage and direct MXF tape archive ingest.

XFConverter v1.1 is the ideal glue for converting MXF formats into other interoperable wrapping formats, such as QuickTime and Op1a MXF, which are available in most of the solutions on the market today.

These MXF solutions will be showcased in a fully integrated workflow using the OpenCube solutions and P2 equipment. The OpenCube team will be happy to demonstrate the value of the solution in terms of efficiency and ROI.

P2Soft HD is currently being deployed in a number of installations. OpenCube solutions are increasingly in demand because of their high level of interoperability and rapid time-to-market. OpenCube is an innovative company that invests heavily in R&D and leverages its MXF knowhow in order to offer customers comprehensive world-class solutions for boosting their “file-based workflow”.

Group Peers into 3-D Future for Blu-Ray

The Blu-Ray Disc Association is developing its position on stereoscopic 3-D under growing pressure from Hollywood studios that want to create a home video market for their rising number of stereo 3-D movies.

"There are discussions going on right now, and we are putting together a public statement," said Andy Parsons, chairman of the association's marketing group.

At least four ad hoc industry groups have formed this year to explore standards for stereo 3-D on TV. The Society of Motion Picture and Television Engineers held a meeting this week to form a task force to explore a 3-D content mastering standard. The Consumer Electronics Association will hold a meeting in October to determine if it should try to set standards potentially covering TVs, set-top boxes and disc players.

Since 2007, studios have released or put on the drawing board as many as 80 stereo 3-D movie titles. At the Intel Developer Forum Wednesday (Aug. 20), Dreamworks co-founder Jeffrey Katzenberg said all his studio's animated movies starting next year will be created and available in stereo 3-D, a shift he said was as significant as the transitions to talkies and color.

Theoretically, the Blu-Ray group could take one of two broad approaches to stereo 3-D, said Parsons. It could decide simply to pass any 3-D data on a disc through to HDMI ports, letting the TV render it, or it could render the 3-D information locally, which would require a significant addition to the Blu-Ray specification. If the group opts for the latter approach, it will need to define a standard format. In either case, the group wants to make sure any 3-D approach is compatible with its existing specification for 2-D content, Parsons said.
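A minimal sketch of what a disc-side format might involve, assuming the common side-by-side packing scheme (the Blu-Ray group had chosen no format at the time of writing):

```python
# Sketch: side-by-side frame packing, one common way to carry a stereo
# pair inside an ordinary 2-D video frame. Purely illustrative -- not a
# Blu-Ray specification.

def pack_side_by_side(left, right):
    """Halve each view horizontally and place them side by side.

    `left` and `right` are rows of pixel values (lists of lists) with
    identical dimensions; every second column is dropped from each view,
    so the packed frame keeps the original 2-D width.
    """
    if len(left) != len(right):
        raise ValueError("views must have the same height")
    packed = []
    for l_row, r_row in zip(left, right):
        packed.append(l_row[::2] + r_row[::2])
    return packed

# A 2x4 test pattern for each eye.
L = [[1, 2, 3, 4], [5, 6, 7, 8]]
R = [[9, 10, 11, 12], [13, 14, 15, 16]]
frame = pack_side_by_side(L, R)
```

The trade-off is visible in the code: packing costs half the horizontal resolution per eye, which is why a local-rendering approach with a richer format was also on the table.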

Many see Blu-Ray as the likely first vehicle to deliver stereo 3-D movies to the home. That's because the separate images for right and left eyes in stereo 3-D typically require significantly more bandwidth than 2-D images, creating trouble for broadcast delivery.

"The first real stereo 3-D for the home will be via Blu-Ray and for that you need a standard format," said a senior executive at one large consumer electronics company who asked not to be named.

"If everything goes perfectly this could happen in 2010 or 2011, but it never goes like that," the executive added. "Hopefully there will not be a format war."

Blu-Ray players and titles are just beginning to ramp up in the wake of Toshiba's decision earlier this year to abandon the rival HD-DVD format. About six million Blu-Ray players have shipped into the U.S. to date, most of them embedded in Sony PlayStation 3 consoles.

About 900 Blu-Ray titles are now available, more than double the number out just six months ago, Parsons said.

By Rick Merritt, EE Times

Make 3D Images with Your Own Camera

Photosynth, described by Microsoft as "the next step in the evolution of digital photography," analyses hundreds of pictures and stitches them together to create a composite image with super-high resolution. The software, released recently to the public after nearly two years of trials, lets anyone with a computer create a 3D image known as a synth using their own pictures or those from photo-sharing websites such as Flickr.

"Photosynth analyses the photographs you send it and creates a 3D world in which to present those photographs," Paul Foster, of Microsoft, said.

Unlike other photo-stitching tools, Photosynth requires no human guidance. The software package works out how the photographs fit together by recognising features within them.

"It's got some very clever image analysis algorithms in it, but basically it's looking for shape, for patterns, for texture," Mr Foster said. "It identifies those key points and then maps them across the photos to create the 3D model."
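The matching step Foster describes can be sketched as nearest-neighbour descriptor matching with a ratio test; the toy descriptors and threshold below are illustrative assumptions, not Photosynth's proprietary algorithm:

```python
# Sketch of keypoint matching across photos: pair each feature descriptor
# with its nearest neighbour in the other photo and keep only unambiguous
# matches (Lowe-style ratio test). Toy two-dimensional descriptors only.

def dist2(a, b):
    """Squared Euclidean distance between two descriptors."""
    return sum((x - y) ** 2 for x, y in zip(a, b))

def match_features(desc_a, desc_b, ratio=0.7):
    """Return (i, j) index pairs where desc_a[i] clearly matches desc_b[j]."""
    matches = []
    for i, d in enumerate(desc_a):
        scored = sorted(range(len(desc_b)), key=lambda j: dist2(d, desc_b[j]))
        best, second = scored[0], scored[1]
        # Accept only if the best match is much closer than the runner-up.
        if dist2(d, desc_b[best]) < (ratio ** 2) * dist2(d, desc_b[second]):
            matches.append((i, best))
    return matches

photo1 = [(0.0, 1.0), (5.0, 5.0)]
photo2 = [(0.1, 1.1), (9.0, 9.0)]
pairs = match_features(photo1, photo2)
```

Matched points like these, accumulated across many photos, are what a structure-from-motion step can then triangulate into the 3D model.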

Click to watch the video

Anyone wanting to use the software must first download a free application. This enables a home PC to create the 3D image and allows web browsers to view the results. Navigation tools let users view the images from a range of angles, zooming in on a detail or pulling back for a wide-angle view.

The viewer uses technology from a company called Seadragon, which Microsoft bought in February 2006. This enables a web browser to display small parts of huge images without downloading the whole picture, speeding up data transfer and enabling smoother scrolling and zooming.
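The idea behind such viewers can be sketched as a tile pyramid: the image is stored at successive power-of-two scales, cut into fixed-size tiles, so the browser fetches only the tiles visible at the current zoom. Seadragon's actual format details are assumptions here:

```python
# Sketch: computing a tile pyramid for a large image. Each level halves
# the previous one until everything fits in a single tile. Tile size and
# level ordering are illustrative, not Seadragon's exact format.

import math

def pyramid(width, height, tile=256):
    """Return (level, w, h, tiles_x, tiles_y) tuples, level 0 = full size."""
    levels = []
    level = 0
    w, h = width, height
    while True:
        levels.append((level, w, h,
                       math.ceil(w / tile), math.ceil(h / tile)))
        if w <= tile and h <= tile:
            break
        w, h = max(1, (w + 1) // 2), max(1, (h + 1) // 2)
        level += 1
    return levels

# A 1024x512 image yields three levels, down to a single 256x128 tile.
p = pyramid(1024, 512)
```

At any zoom the viewer only needs the handful of tiles that intersect the viewport at the nearest level, which is what makes scrolling and zooming over gigapixel images feel smooth.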

The playful nature of Photosynth marks a departure for Microsoft, whose corporate image is stuffier and more conservative than that of Google. The search giant has held onto much of its quirky public persona despite its huge market share and vast cash reserves, in part because of fun, engaging applications such as Google Earth and Street View.

Mr Foster parried comparisons with Google by suggesting that Photosynth was more in keeping with the community principles of web 2.0. "It's all about the community," he said. "This isn't a company that's generating stuff centrally. It's up to people to capture what they want to capture."

Photosynth looks unlikely to be an immediate money-spinner for Microsoft – the website will carry no advertising or other commercial content – but the company does expect to find practical uses for the technology.

The London Eye has created a 3D interactive tour of the attraction, while television has hinted at darker applications. An episode of CSI New York shown in the US earlier this year used Photosynth to reconstruct a murder scene so that detectives could analyse it in detail days and weeks after the crime was committed.

Top Ten Tips for Creating Super Synths
Microsoft has put together the following advice for people working with Photosynth, based on the experiences of people who have used the trial version. Sample synths created during the trial phase can be seen on the Photosynth website.

Take lots of photos
Not all photos will connect well and picking the right ones is a matter of experience. To give you plenty of choice, take two to three times more photos than you think you’ll need, up to a maximum of about 300, and experiment until you find the right combination.

Have lots of overlap
You should try for at least 50 per cent overlap on average between photos. This overlap makes the 3D construction possible.

Panorama first, then move around
Start by taking a panorama of your scene, then move around and take more photos from different angles and positions.

Remember the “rule of three”
Each part of the scene you’re shooting should appear in three separate photos, taken from different locations.

Limit the angles between photos
Try to get one photo every 25 degrees or so. That will make the synth work better. Extreme angle differences on a subject won’t match up.

Shoot scenes with lots of detail and texture
The features in the photos are what tie them together. A blank wall won’t synth, but one with lots of art or posters will work well.

Don’t crop images
Cropping eliminates important information that Photosynth needs, or makes the focal length inaccurate.

Shoot wide shots
Wide-angle shots reconstruct more reliably than closer shots. It’s good to have close-ups, too, but you’ll want to have good coverage of your subject with lots of nice overlapping wide shots.

Limit post processing
You can adjust anything that won’t drastically alter the photos (brightness, contrast, red-eye and so on). Other than that, leave it alone.

Keep photos the right way up
Make sure your photos are all the right way up before you start synthing.
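As a rough shoot planner for the tips above (the 25-degree spacing, the two-to-three-times oversampling advice and the ~300 photo cap come from the list; the helper names are invented):

```python
# Back-of-the-envelope planner based on Microsoft's Photosynth tips:
# how many shots does a full 360-degree ring around a subject need at
# roughly 25 degrees per step, and how many should you actually take?

import math

def ring_shots(angle_step_deg=25):
    """Minimum photos for one full ring around a subject."""
    return math.ceil(360 / angle_step_deg)

def shots_to_take(needed, oversample=3, cap=300):
    """Apply the 2-3x oversampling advice, respecting the ~300 photo cap."""
    return min(needed * oversample, cap)

ring = ring_shots()            # photos per ring at 25-degree spacing
planned = shots_to_take(ring)  # photos to shoot so you can pick the best
```

Fifteen shots cover one ring; tripling that for choice still leaves plenty of headroom under the 300-photo ceiling, so multiple rings at different heights are practical.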

By Holden Frith, Times Online

Lifelike Animation Heralds New Era for Computer Games

Extraordinarily lifelike characters are to begin appearing in films and computer games thanks to a new type of animation technology developed by Image Metrics.

Click to watch the video

Emily - the woman in the above animation - was produced using a new modelling technology that enables the most minute details of a facial expression to be captured and recreated. She is considered to be one of the first animations to have overleapt a long-standing barrier known as the "uncanny valley" - the perception that animation becomes less convincing, even unsettling, as it approaches human likeness.

Emily is a truly monumental achievement, recreating every nuance of human facial expression, even though what you’re actually looking at is the face of a digital actor. Created through a partnership with USC’s Institute for Creative Technologies (ICT), the team’s primary objective was to create a completely convincing, animated computer-generated face.

Using ICT’s special scanning system that can capture facial details down to the individual pore, the face of actress Emily O’Brien was transformed into a digital representation of herself, which could then be entirely machine-manipulated. A special spherical lighting rig captured O’Brien in 35 reference facial poses using a pair of high resolution digital cameras. The facial maps were then converted into 3D data using Image Metrics’ proprietary markerless motion capture technology.

"Ninety per cent of the work is convincing people that the eyes are real," Mike Starkenburg, chief operating officer of Image Metrics, said. "The subtlety of the timing of eye movements is a big one. People also have a natural asymmetry - for instance, in the muscles in the side of their face. Those types of imperfections aren't that significant but they are what makes people look real."

Previous methods for animating faces have involved putting dots on a face and observing the way the dots move, but Image Metrics analyses facial movements at the level of individual pixels in a video, meaning that the subtlest variations - such as the way the skin creases around the eyes - can be tracked.

"There's always been control systems for different facial movements, but say in the past you had a dial for controlling whether an eye was open or closed, and in one frame you set the eye at 3/4 open, the next 1/2 open etc. This is like achieving that degree of control with much finer movements. For instance, you could be controlling the movement in the top 3-4mm of the right side of the smile," Mr Starkenburg said.
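The "dial" idea Starkenburg describes maps naturally onto blendshape animation: a neutral mesh plus weighted offsets, where finer control simply means more, smaller regions each with its own weight. A toy sketch, with invented control names and one-dimensional vertex data:

```python
# Minimal blendshape sketch: pose = neutral + sum(weight * delta) per
# control. The vertex data and control names below are invented for
# illustration -- this is not Image Metrics' rig.

def blend(neutral, deltas, weights):
    """Blend a face pose from a neutral shape and weighted offsets."""
    result = list(neutral)
    for name, delta in deltas.items():
        w = weights.get(name, 0.0)
        result = [v + w * d for v, d in zip(result, delta)]
    return result

# One-dimensional toy "face" with a coarse dial and a fine dial.
neutral = [0.0, 0.0, 0.0]
deltas = {
    "eye_open":        [1.0, 1.0, 1.0],   # coarse: whole eyelid
    "smile_top_right": [0.0, 0.0, 0.5],   # fine: a small local region
}
pose = blend(neutral, deltas, {"eye_open": 0.75, "smile_top_right": 1.0})
```

Setting "eye_open" to 0.75 is exactly the "3/4 open" dial from the quote; the second control shows how a rig gains finer movements by adding smaller, localized deltas rather than more dial positions.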

For many years now, animators have come up against a barrier known as the "uncanny valley", which refers to how, as a computer-generated face approaches human likeness, it begins to take on a corpse-like appearance similar to that in some horror films. As a result, computer game animators have purposely simplified their creations so that the players realise immediately that the figures are not real.

"There came a point where animators were trying to create a face and there was a theory of diminishing returns," said Raja Koduri, chief technology officer for graphics at AMD, the chip-maker. AMD last week released a new chip with a billion transistors that will be able to show off creations such as Emily by allowing a much greater number of computations per second. "If you're trying to process the graphics in a photo-realistic animation, in real time, there's a lot of computation involved," said Mr Koduri. He said that AMD's new chip - the Radeon HD 4870 X2 - was able to process 2.4 teraflops, meaning it had a capability similar to a computer that, only 12 years ago, would have filled a room; AMD's chip fits inside a standard PC. But he said that the line between what was real and what was rendered would not be blurred completely until 2020.

There have been several advances in computer-generated imagery (CGI) in recent years. One project at the University of Southern California involves placing an actor inside a giant metallic orb which fires more than 3,000 lights from a range of different angles - and with different degrees of intensity - at the actor while he or she is being filmed performing an action. The image captured by the camera can then be transported into another piece of film and the lighting effect (on the actor) chosen according to the ambient lighting in the scene.

Click to watch the video

Sources: Times Online (Jonathan Richards) and Technabob (Paul Strauss) via Marketsaw (Jim Dorey)

The Enterprise Solution for QuickTime-MXF Workflows

DAVID Systems' AtomFactory harmonizes the editing process of Apple's Final Cut Pro with the MXF-driven workflows of professional IT-based broadcast environments by converting MXF content on the fly to QuickTime and vice versa. Useful far beyond format conversion, however, AtomFactory automatically controls background functions that minimize bandwidth consumption, centralize media production, and accelerate workflows to make producing compelling content on Final Cut Pro within tight deadlines seamless and efficient. New in DAVID Systems' product portfolio is an integrated DigaSystem/AtomFactory bundle to round out exchange and integration functionality in a DigaSystem environment. DigaSystem is DAVID Systems' multimedia production management platform, used by the largest broadcast organizations with more than 15,000 seats installed worldwide.

A subset of Hydrogen Media Accelerator (HMA) designed specifically for Final Cut Pro, AtomFactory not only provides edit-while-ingest and playout-while-fusing capability but solves many integration problems. Even in production environments with twenty or more NLEs and additional central processes like ingest from a video server, AtomFactory is the perfect solution. The HMA central services HMA Ingest, HMA Send, and HMA Fusion are employed to share tasks and to intelligently manage resources such as bandwidth and workstation availability.

HMA Ingest, installed centrally on a server, allows every member of the workgroup to ingest incoming material without having an additional application installed. In that case, only HMA Send and the QuickTime-to-MXF conversion are installed on every workstation, while the MXF-to-QuickTime conversion is operated externally. HMA Send optimizes the traffic between client and server to reduce the load on the server.

In combination with HMA Fusion, it is even possible to offload the MXF generation to a central system. In such a scenario only the modified parts of the project are sent from Final Cut Pro to this server which is ideal for those broadcasters who receive material in MXF but prefer production in the native QuickTime format. Additionally, many daily tasks can be automated through AtomFactory's scripting functionality. Whether exporting to third-party systems, archiving, uploading via FTP, or generating backup copies, a few lines of code are enough to push a follow-up action automatically.
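The kind of follow-up automation described above can be sketched generically: scan an export folder for newly finished files and trigger an action for each. AtomFactory's real scripting interface is not shown here; the folder layout, file suffix and the "send to playout" step are hypothetical placeholders.

```python
# Generic watch-folder sketch for automated follow-up actions. Names and
# actions are invented for illustration -- not AtomFactory's scripting API.

import os

def new_exports(names, seen, suffix=".mov"):
    """Filter a directory listing down to unhandled exports."""
    fresh = [n for n in sorted(names) if n.endswith(suffix) and n not in seen]
    seen.update(fresh)
    return fresh

def follow_up(filename):
    # Placeholder action: a real script might FTP-upload, archive,
    # or hand the file to a playout system here.
    return "sent to playout: " + filename

def watch_once(folder, seen):
    """One polling pass over an export folder."""
    return [follow_up(n) for n in new_exports(os.listdir(folder), seen)]
```

Running `watch_once` on a timer (or wiring the same logic into the product's own scripting hooks) is all it takes to push an export onward automatically, which matches the "few lines of code" claim.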

There are other helpful and elaborate features. Once the producer has edited his QuickTime project and is ready to export it, he often faces a bewildering choice among a seemingly endless list of folders, all with cryptic names. AtomFactory reduces the choice to a manageable number of folders with a clear structure and understandable titles. That way, a folder with a follow-up task can be clearly named, for example "Send to Playout." AtomFactory is also configurable with a sequence preset, thus avoiding working with incorrect sequence settings by mistake.

Especially designed for the use in DigaSystem environments are the new bundles to integrate HMA/AtomFactory into the DigaSystem. Supported by DigaTransfer System the bundles allow bidirectional file exchange between Final Cut Pro and DAVID Systems' content management system including the typical AtomFactory features like edit-while-ingest or playout-while-fusion.

Source: PresseBox

Intel May Drive 'PC-ification' of Stereo 3-D

Details of the plot are unclear, but the early reviews are in: Intel has taken a great "leap ahead" in extending its x86 jewel into consumer electronics. Perhaps its greatest (and least specific) new initiative is a move into stereoscopic 3-D. At the Intel Developer Forum here Wednesday (Aug. 20) Intel announced a deal with DreamWorks Animation to enhance its content for 3-D cinema, as well as a separate initiative to bring 3-D to TVs and other devices. The duo will brand content from their work as Intru3D.

That's about all the detail available from either company. (Katzenberg left the forum without granting EE Times an interview, and Intel isn't saying much more.) But that's enough detail to get some long-time consumer watchers excited.

"Intel is early enough with stereo 3-D that they can still influence that path and may be able to carve out an intellectual property stake in the technology," said Richard Doherty, principal of technology consulting firm Envisioneering (Seaford, N.Y.).

Just what Intel is doing — or not doing yet — is perhaps less important than the fact the PC processor giant has woken up to the big trend ahead. Hollywood studios, consumer electronics giants and technology providers are gearing up to drive the emerging phenomenon of 3-D cinema into home products as the next big thing beyond today's big flat-screen HDTV. The trend has been years in the making, so far largely without the help of the PC industry.

"Intel is finally becoming a player here and Dreamworks will give them credibility," Doherty said. "Intel has the capability to bring real insights into the needs of stereo 3-D in silicon and tools," he added.

"Three-D for the cinema is here," said Eric Kim, general manager of Intel's digital home group, in a press Q&A at IDF. "There are a lot of interesting technologies to bring stereo 3-D to the TV, but it is still in an early stage. Once the TV can handle today's polarized [3-D glasses], the technology will take off in a big way," added Kim, referring to the inexpensive sunglasses used by companies such as Real D Cinema to bring 3-D to as many as 1,100 movie screens today.

Today there is no inexpensive way to bring stereo 3-D to the LCD displays used in mainstream HDTVs. That's because LCDs typically have slow switching times, a problem for stereo 3-D schemes that pulse quickly between left- and right-eye images.
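The arithmetic behind that concern, assuming the common alternating-frame (frame-sequential) scheme:

```python
# Why LCD switching speed matters: frame-sequential 3-D alternates left
# and right images, so the panel must refresh at twice the per-eye rate,
# and must switch and settle within each frame's time budget.

def panel_rate_hz(per_eye_hz=60, eyes=2):
    """Display refresh rate needed to give each eye `per_eye_hz` frames."""
    return per_eye_hz * eyes

def frame_budget_ms(per_eye_hz=60, eyes=2):
    """Time available to switch and settle between alternate-eye frames."""
    return 1000.0 / panel_rate_hz(per_eye_hz, eyes)

rate = panel_rate_hz()       # Hz needed for 60 frames per eye
budget = frame_budget_ms()   # milliseconds per alternate-eye frame
```

A panel that takes longer than that budget to settle leaves a ghost of the previous eye's image on screen, which is exactly the crosstalk problem slow LCDs create.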

Intel will work to bring stereo 3-D to TVs, PCs and mobile devices, said Renee James, an Intel software executive. However, she gave no details on exactly what role Intel will play in an area already crowded with providers and many more expected.

Real D, which has a commanding lead in the cinema, hired Koji Hase in January to head its move into 3-D TV. Hase won a technical Emmy for his work helping to create the DVD format. Real D is not discussing specifics of its plans for a consumer offering yet.

Dolby Labs also has a 3-D cinema technology getting deployed in theaters, but it would not comment on any plans for a 3-D approach. A handful of smaller companies including TDVision Systems and Sensio Technologies (Montreal) are pursuing this area with announced products.

Consumer giants such as Panasonic and others say they have work in their labs and consider 3-D TV strategic, but they are not willing to comment on the record.

Doherty said Intel could help with the transition to stereo 3-D by providing production tools, something lacking today. "The 3-D tools used today have largely been created internally inside the studios," Doherty said. "Eliminating the cost of developing these tools will really help."

"The PC-ification of stereo 3-D could be a major catalyst for this transition" to 3-D TV, said Doherty. "This is the biggest news to come out of IDF."

Now that Intel has rolled out its plans, expect Nvidia and Advanced Micro Devices to follow.

By Rick Merritt, EE Times

Intel, Dreamworks Create Tools for Stereo 3-D

Intel Corp. is partnering with DreamWorks to create enhancements for stereoscopic 3-D in the cinema. The partnership is an aspect of a broader initiative at Intel to bring stereo 3-D to a wide variety of devices, including TVs. The move comes at a time when a broad array of Hollywood studios, consumer electronics companies and technology providers are converging to define 3-D TV.

DreamWorks co-founder Jeffrey Katzenberg came to the Intel Developer Forum to announce the partnership, calling stereo 3-D a third generation in entertainment following the shifts to talkies and color. All DreamWorks' animated features will be created and made available in stereo 3-D starting in 2009, said Katzenberg, who showed the 2,500 attendees here clips from its next 3-D title, Monsters vs. Aliens.

Katzenberg said major Hollywood directors, including Steven Spielberg, James Cameron and Peter Jackson, are working on live-action movies in stereo 3-D. "In the next few years they will make some of the best films using these new techniques," he said.

Intel is developing a variety of tools to help deliver 3-D on multiple devices, said Renee James, an Intel software executive. The work with DreamWorks on 3-D cinema enhancements will be branded Intru3D.

The move was Intel's second big step into the CE market this week. The company also announced a new system-on-chip and software partnership with Yahoo! to bring the Internet to TV.

By Rick Merritt, EE Times

FOR-A Demonstrates MediaConcierge Content Management System

FOR-A will demonstrate MediaConcierge, a comprehensive content management system for use by broadcast television stations and video production houses worldwide. The first two MediaConcierge products in the system, the MBP-100SX baseband converter and the MCS-100M MXF file conversion software, will make their European debut at IBC.

MediaConcierge manages content for broadcasters so that they can take full advantage of all the business opportunities that exist for their media assets. It handles tasks such as media ingest and format conversion, allowing each media asset to be repurposed for delivery via a variety of outlets, including broadcast, the Internet, and packaged media. The system converts between common file and digital video formats; facilitates search, browse, and file delivery; and interfaces to third party media content databases and archives.

The MBP-100SX converter converts XDCAM HD MXF files to baseband video and HD-SDI signals, and vice versa, in realtime for applications requiring baseband video. This converter uses a new MBP video card developed by FOR-A that supports XDCAM HD, DVCPRO HD and AVC-Intra tapeless media files. FOR-A will develop additional converters, including one for H.264 and another for AVC-Intra. Both will utilize an MBP board and will be part of the MediaConcierge product family, depending upon future market needs.

During the shooting, editing, and delivery process, there are many points where the file must be converted to baseband signals. Workflow is improved when converters receive video data as a file via the network, convert that data to baseband signals, and then enable the playback.

The MCS-100M MXF conversion software - which consists of three software plug-ins, MCS-10PD (DVCPRO), MCS-10PA (AVC-Intra) and MCS-10SX (XDCAM) - enables the conversion of MXF (Material Exchange Format) data files between the P2 HD and XDCAM HD formats.

With MediaConcierge, once data has been managed, high-speed search based on metadata and proxy data can immediately locate stored scenes. The high compatibility of the MediaConcierge database structure means it can be linked with nearly all existing databases. Therefore, operators can create a unified management environment without replacing their existing system configuration.

The MBP-MXF baseband converter and P2-XDCAM format conversion software are now shipping.

Source: FOR-A