3-D Camera Rigs

Some pictures of 3-D rigs taken by Werner Bloos during the last Digital Cinematography exhibition. You can also download the original side-by-side stereoscopic pictures.

3ality TS3






3ality TS1







P+S Technik





Ludwig Kameraverleih







Half-mirror 3-D Display


Source: Stereoforum

3D Television Without the Special Glasses

Philips, the electronics company, has created a 42-inch 3D TV set that gives viewers the impression of images leaping out at them. Experts believe the development will revolutionise the TV viewing experience because people will feel that they are not just watching the action but are actually in it.

"If you are a big fan of EastEnders, you will feel as if you’ve been invited into the homes of your favourite characters,” said Ben Nicholls, business development director at Picture Production Company (PPC), which is developing content for the new 3D TV set.

"You’ll feel as if you are actually in Pat Butcher’s kitchen, and if you are a big football fan, you’ll feel as if you are watching the game with the rest of the supporters. It's a totally immersive experience that makes people feel incredible involved in what's going on."

Philips is already selling the sets to the commercial advertising market for up to £20,000 each, but the company is confident that cheaper versions costing £1,200 will be available for consumers in three years' time when enough content is available to justify splashing out on the new technology.

To see a three-dimensional image on a flat television screen, the viewer's right and left eyes need to see slightly different images, taken from different angles, to trick the brain into thinking it is seeing a 3D object.

For decades this has been achieved by viewers wearing special glasses which feed a slightly different image to the right and left eyes, using different coloured lenses. Philips has dispensed with the need for glasses by placing a lens on the screen which creates the same effect. The technology comes at a time when Hollywood studios are becoming increasingly interested in making films in 3D.
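
The coloured-lens approach can be illustrated with a short sketch: a red/cyan anaglyph places the left-eye view in the red channel and the right-eye view in the green and blue channels, so each lens admits only the image intended for that eye. This is a minimal illustration in Python, assuming NumPy and Pillow are installed and using placeholder file names for a stereo pair; it is not Philips' lenticular technique, which needs no glasses at all.

# Minimal red/cyan anaglyph sketch (illustration of the glasses-based
# approach described above; "left.png" and "right.png" are placeholders).
import numpy as np
from PIL import Image

left = np.asarray(Image.open("left.png").convert("RGB"), dtype=np.uint8)
right = np.asarray(Image.open("right.png").convert("RGB"), dtype=np.uint8)

anaglyph = np.empty_like(left)
anaglyph[..., 0] = left[..., 0]      # red channel carries the left-eye view
anaglyph[..., 1:] = right[..., 1:]   # green and blue carry the right-eye view

Image.fromarray(anaglyph).save("anaglyph.png")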

Animated 3D children's films, such as The Polar Express and Disney's Meet the Robinsons, have been hugely successful, and prominent directors, including James Cameron and Steven Spielberg, are currently working on 3D films scheduled for release next year. Mark Hurry, the director of legal and commercial at PPC, predicted that 3D sets would follow the success of high-definition television screens.

"The early adopters will be companies looking for innovative ways to attract passers-by, but it will only be a matter of time before the demand for 3D TV grows so much that consumers will want it in their homes," he said.

"Content owners and production companies should be thinking now about 3D as a new way of reviving their back catalogues."

By Nicole Martin, Telegraph

ProjectionDesign Launches Active 3D Stereoscopic Projector

The new F10 AS3D single-chip DLP projector from ProjectionDesign is the only portable high-resolution active 3D stereoscopic projector on the market. Designed for demanding visualisation and simulation environments, the F10 AS3D features patented technologies to achieve its remarkable performance. At InfoComm, the F10 AS3D will be shown alongside ProjectionDesign's F20 sx+ 3D passive stereo projectors, demonstrating a complete range of 3D stereoscopic projection solutions.

ProjectionDesign F10 AS3D

Source: BroadcastBuyer

Guillemot Confirms 3D Games Development at Ubisoft

Yves Guillemot, CEO of French publisher Ubisoft, has confirmed that his company is currently working on developing 3D games. Ubisoft is working on the game of James Cameron's 3D movie Avatar, with Guillemot stating that title will be released next summer.

"Technology continues to evolve and we have now new 3D TVs that are coming in homes and we will have also movies in 3D," said Guillemot, speaking at Ubidays in Paris. "Because we will have movies and TVs, we are actually working on games in 3D."

He added: "James Cameron has said he is working with us on Avatar in 3D, I am proud to tell you today we have seen it and it is wonderful. I am sure you will love this new experience because this new technology is just wonderful. You will see that, I think, in the middle of next year."

Cameron revealed Ubisoft was working on 3D titles earlier this week at a Microsoft event, following the publisher picking up the licence to the film in 2007.

By Matt Martin, GamesIndustry

3D Stereo for World Cup?

A successful technical trial of the IIHF hockey World Championship could lead to a stereoscopic broadcast of the 2010 Soccer World Cup, writes Adrian Pennington.

HBS, the Swiss-based production unit of sports marketing group Infront Sports and Media, covered the IIHF finals from Canada in HD and simultaneously conducted its first trial into 3D HD. Crucially, HBS is also host broadcaster for the 2010 World Cup in South Africa.

"3D is on our radar and we'd like it to be on FIFA's radar," said Peter Angell, director of the Production and Programming Division, HBS. "In our position as host broadcaster of the World Cup we are required to keep FIFA abreast of all new technologies and make it an option for them."

The IIHF trial was delivered in conjunction with The3DFirm, a triumvirate of British companies behind the production of the world's first satellite delivered multi-camera 3D sports event during the Scotland vs. England Six Nations Rugby International in March.

"One of the critical things about the way we tackled ice hockey was to start from the point of view of adopting a regular broadcast workflow and trying to make that work in 3D rather than taking a 3D approach and trying to make that work in a broadcast environment," said Angell. "If 3D outside broadcasts are to take off, production needs to be straightforward and cost effective using standard EVS machines and vision mixers which are available anywhere rather than bespoke camera rigs. What attracted us to The3DFirm was that their approach enabled us to use the same workflow as for 2D HDTV."

Specifically, the trial deployed three static pairs of Thomson LDK 6000s around the rink - one central and high, and two focused on either goal - with a pair of Iconix HD-RH1s for low-level point-of-view coverage at one of the goals.

The3DFirm transported its own rigs for control of ocular convergence. "We chose a point of convergence at the nearest bit of action - at the rink edge - and matched the lenses and camera height. It took just half an hour to align each rig, which is very attractive to us to just slot into a regular broadcast environment," explained Angell.
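
The alignment Angell describes comes down to simple geometry: for a chosen convergence distance, each camera of the pair is toed in by the angle whose tangent is half the interaxial separation divided by that distance. A rough sketch with illustrative numbers (assumptions, not HBS's actual rig figures):

# Illustrative toe-in calculation for a converged stereo pair; the
# interaxial and convergence values below are assumptions, not HBS data.
import math

interaxial_m = 0.065        # assumed lens separation, roughly human eye spacing
convergence_m = 20.0        # assumed distance to the rink-edge convergence point

toe_in_deg = math.degrees(math.atan((interaxial_m / 2) / convergence_m))
print(f"toe-in per camera: {toe_in_deg:.3f} degrees")   # about 0.093 degrees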

The cameras were triax-cabled and gen-locked to a vision mixing truck, and the feeds were recorded to EVS XT[2] servers and then to HD recorders synced with time code. Feeds were previewed on a small monitor in the truck using polarized glasses. A small projection unit is a possible future addition.

"The doubling of everything is the main cost," reported Angell. "For every one camera position you need two cameras and two sync paths throughout the truck. Slow motion requires dual EVS channels. Half the amount of inputs are available on the vision mixer and router. Aside from that the resource requirements and workflow are the same.

"We are looking to design one or two broadcast conversion rigs capable of handling zooms that are built from standards like triax," reported Angell. "We're also exploring the possibility of using the mixing truck to manipulate the convergence electronically rather than at the camera head.

"From a TV production point of view this is an exciting an innovation as there has ever been," he said. "Any stakeholder in any major sports event will be looking at 3D over the next twelve months to see if there's a market and a business model which works."

The IIHF 3D trial was not transmitted but will serve as a showreel for HBS's 3D capabilities.

"We strongly believe that groundbreaking production technologies like 3D HDTV are a key differentiator in the fast changing media and marketing environment," said Infront President and CEO, Philippe Blatter, who confirmed the group's investment in 3D.

Bruno Marty, Infront's executive director winter sport, added: "3D rights will become increasingly valuable to sports rights-holders and the 3D experience also creates entirely new communication and promotion platforms for sponsors. The potential for sport is huge."

Source: TVB Europe

VisuMotion supports Philips' WOWvx format to Leverage 3D Visualisations

Germany-based VisuMotion GmbH has signed licence agreements in the past week with the Netherlands-based electronics corporation Philips. Under the agreements, VisuMotion may use some of Philips' worldwide patents and technologies for processing 3D content based on 2D-plus-Depth information, such as that implemented in Philips' WOWvx format. This opens up remarkable opportunities for VisuMotion's future software sales making use of the licensed technologies.

In addition, VisuMotion is now preparing the next generation of its software products, which will support the Philips-defined WOWvx file format for 3D pictures and videos based on 2D-plus-Depth information. The most important products of the Jena-based company will thus be able to drive Philips' glasses-free 3D screens, and the list of 3D display manufacturers whose screens can be powered by VisuMotion's software and 3D content grows to 12 of the world's leading manufacturers.
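
2D-plus-Depth stores an ordinary image together with a per-pixel depth map; the display (or driving software) then synthesises additional viewpoints by shifting pixels horizontally in proportion to their depth. The sketch below is a deliberately naive illustration of that idea, not Philips' or VisuMotion's renderer, which must also fill the gaps that shifting leaves behind and generate many views at once.

# Naive single-view synthesis from 2D-plus-Depth data (illustration only).
import numpy as np

def synthesize_view(image, depth, max_disparity_px=8):
    # image: HxWx3 uint8 array; depth: HxW floats in [0, 1], 1.0 = nearest.
    h, w = depth.shape
    out = np.zeros_like(image)
    disparity = (depth * max_disparity_px).astype(int)
    for y in range(h):
        for x in range(w):
            nx = x + disparity[y, x]        # nearer pixels shift further
            if 0 <= nx < w:
                out[y, nx] = image[y, x]
    return out

# Synthetic example: a grey frame with a "near" square that will shift.
img = np.full((120, 160, 3), 128, dtype=np.uint8)
dep = np.zeros((120, 160))
dep[40:80, 60:100] = 1.0
shifted_view = synthesize_view(img, dep)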

The use of glasses-free 3D displays is rapidly increasing worldwide, particularly for digital signage applications, 3D gaming, research and development as well as for medicine and Virtual Reality setups. Independent market researchers foresee worldwide turnover for 3D products of several hundred million euros as early as 2010.

VisuMotion's product portfolio includes a multi-view 3D camera, 3D Rendering Plug-ins for Autodesk's 3D Studio Max and Maya, the Compositing and Editing Software "3D StreamLab", the 3D Application Driver "DeepOutside3D", which supports various 3D games, as well as the 3D video playback software "3D Movie Center".

Source: Stereoscopy

eMagin Z800 3DVisor

eMagin Corporation is using the Society for Information Display Conference and Exhibition to showcase its latest OLED microdisplays, which feature in products such as the Z800 3DVisor. The visor uses two power-efficient OLED microdisplays to provide wearers with the 3D equivalent of a 105-inch display viewed from 12 feet. It seems the longer this product is out there, the more potential applications surface - military training, graphic design, architectural modeling and ... cranking up the volume on Half-Life 2 and whipping yourself into a fully immersive zombie killing frenzy.


Drawing its power entirely from a USB connection, the US$1499 Z800 3DVisor integrates the SVGA 3D OLED microdisplays with stereo audio, a noise-canceling microphone, and a high-speed headtracker that enables full 360-degree virtual-surround viewing. The microdisplays use the same technology as the OLED displays in the US Army Land Warrior program, and can deliver high-speed, 800x600 triad pixel resolution and high color depth (more than 16.7 million colors). The displays can recognize and deliver left-eye and right-eye stereovision signals, buffering color data under each pixel site to eliminate flicker and smear. The visor is compatible with PCs that can produce an analog SVGA resolution (800x600) with a refresh rate of 60 Hz - the 3DVisor cannot yet support higher resolutions or refresh rates, but it can still deliver good visual quality when only 2D information is available.

The Z800 3DVisor web site has an extensive list of computer games that have stereovision 3D compatibility, including Unreal Tournament 2003 and 2004, Doom 3, Painkiller, Half-Life 2 and America's Army. When in stereovision 3D mode, the 60 Hz signal alternates between each display, providing a flicker-free refresh rate of 30 Hz per eye. The headtracker uses micro-electro-mechanical system accelerometers and gyroscopes to detect motion, allowing the movement of the user's head to control mouse movement onscreen and giving users a fully "in-game" experience.
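
The frame-sequential arrangement is simple to picture: the single 60 Hz SVGA signal carries left-eye and right-eye frames alternately, so each eye is refreshed 30 times per second. A toy sketch of the de-interleaving (placeholder frame data, not eMagin's firmware):

# Frame-sequential stereo as described above: a 60 Hz stream alternates
# L, R, L, R..., giving each eye an effective 30 Hz refresh rate.
signal_hz = 60
frames = [f"frame_{i}" for i in range(6)]      # placeholder frame identifiers

left_eye = frames[0::2]     # even-numbered frames drive the left display
right_eye = frames[1::2]    # odd-numbered frames drive the right display

per_eye_hz = signal_hz * len(left_eye) / len(frames)
print(per_eye_hz)           # 30.0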

OLED technology has vastly enhanced the capabilities of VR platforms, for both gaming and training purposes. Unlike LCD screens, which require backlights, the pixels in Organic LEDs directly emit light, conserving space and power. It was this advantage that led to the widespread incorporation of OLED screens into MP3 players, phones and cameras, which were under pressure to provide increasingly detailed screens in ever-smaller units. However, the acceleration of OLED development has allowed it to grow far beyond its roots in handheld electronics. The higher contrast and wider dispersion of light provided by OLED displays make them ideal for near-eye use, in products like the Z800 3DVisor.

Source: Gizmag

GDC Technology Launches New EN-2000 DSR Digital Film Agile Encoder

Building on the success of its highly popular EN-1000 DSR Digital Film Agile Encoder, GDC Technology is pleased to announce the launch of its next-generation encoder, the EN-2000 DSR Digital Film Agile Encoder.

The EN-2000 DSR Digital Film Agile Encoder retains all the features of its predecessor, the EN-1000. In addition, the EN-2000 has the following new features:
- 2K DCI JPEG2000 compression – support for real-time or faster
- 4K DCI JPEG2000 compression
- Input sources expanded to include TIFF, DPX, and Targa files
- Image processing options such as cropping and scaling
- Color conversion using 3D lookup table
- Automatic color conversion to DCI X’Y’Z’ color space (see the sketch after this list)
- Support for SMPTE-compliant DCI packages including the stereoscopic package
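
DCI distribution masters carry 12-bit X'Y'Z' code values; the commonly cited DCI encoding applies a 1/2.6 power function with a 52.37 normalisation to CIE XYZ values referenced to a 48 cd/m2 peak white. The sketch below is a generic illustration of that conversion, not GDC's implementation, and it assumes an sRGB/D65 matrix for the RGB-to-XYZ step (a real mastering pipeline would use the source's actual primaries).

# Hedged illustration of linear RGB -> 12-bit DCI X'Y'Z' code values.
import numpy as np

RGB_TO_XYZ = np.array([[0.4124, 0.3576, 0.1805],   # assumed sRGB/D65 matrix
                       [0.2126, 0.7152, 0.0722],
                       [0.0193, 0.1192, 0.9505]])

def rgb_to_dci_xyz12(rgb_linear, peak_cd_m2=48.0):
    # rgb_linear: (..., 3) floats in [0, 1]; returns integer 12-bit codes.
    xyz = rgb_linear @ RGB_TO_XYZ.T * peak_cd_m2
    return np.round(4095.0 * (np.clip(xyz, 0, None) / 52.37) ** (1 / 2.6)).astype(int)

print(rgb_to_dci_xyz12(np.array([1.0, 1.0, 1.0])))  # Y' code for peak white ~3960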

Source: DCinemaToday

cineSync Pro Released

Today sees the release of cineSync Pro - the ultimate synchronised review and approval solution.

Expanding on the hugely successful and widely adopted cineSync, cineSync Pro takes interactive, real time reviews to another level and is fast becoming an essential part of production and post production pipelines the world over.

Available now, cineSync Pro has been developed in response to the needs of VFX, animation, film production, network television and DI facilities - facilities that deal with a large volume of media across multiple locations and which require a level of interactivity not available in existing media review tools. Available on Windows, OSX and Linux, cineSync Pro is also designed to be deployed as an internal review tool, streamlining project management workflows.

cineSync Pro is designed to provide absolute "visual context", removing all potential for confusion or misinterpretation, by ensuring that everyone in the review and approval process sees exactly the same thing at the same time.

Features of cineSync Pro include the ability to play stereoscopic material, to apply 3D colour look-up tables (LUTs) to loaded movies and still images and to display in full screen on a monitor or projector. With cineSync Pro you are now able to adjust colour, brightness, gamma, saturation and contrast, zoom and pan to specific areas of interest, apply hard and soft masks and modify stereo convergence, all in total synchronisation with everyone else in the review - no matter where they are in the world.
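
Convergence adjustment during review usually amounts to horizontal image translation: sliding the left-eye and right-eye images against each other changes the parallax of every point and therefore moves the whole scene toward or away from the screen plane. A minimal sketch of the idea (a generic technique, not necessarily how cineSync Pro implements it):

# Convergence adjustment by horizontal image translation (generic sketch).
import numpy as np

def adjust_convergence(left, right, shift_px):
    # Shift the two views against each other; the sign of shift_px decides
    # whether the scene appears to move toward or away from the viewer.
    return (np.roll(left, shift_px, axis=1),
            np.roll(right, -shift_px, axis=1))

left = np.zeros((10, 20, 3), dtype=np.uint8)
right = np.zeros((10, 20, 3), dtype=np.uint8)
left_adj, right_adj = adjust_convergence(left, right, shift_px=2)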

Source: cineSync

PX-100 Media Server for Sony XDCAM EX

Engineered specifically for Sony PMW-EX1 users, the PX-100 Media Server for Sony XDCAM EX from Focus Enhancements makes media asset management simple and cost-effective. A fully integrated solution, the PX-100 Media Server combines PX Media Asset Management software with professional-grade server hardware to provide an affordable turnkey media asset management system that can easily grow as your needs evolve.


With the PX-100 Media Server, you can integrate your entire workflow, including acquisition, storage, and distribution, to gain unparalleled efficiencies. Designed for fast performance, the PX-100 Media Server uses rich metadata to make it easy to find and retrieve footage. The browser-based interface provides instant access to your files from any location.

Orange Plans 3D Tennis

This year, for the very first time at Roland Garros, Orange is going to film and broadcast live its first 3D sports content for its guests. The trial will take place on Monday 26 and Tuesday 27 May 2008: Orange will use 3D cameras to film all the matches on the Suzanne Lenglen court and broadcast them live. These 3D matches will also be available on VOD until the end of the tournament. At Roland Garros and in its two flagship stores located at Champs Elysées and Paris Madeleine, Orange will be providing its guests with 3D glasses to watch the matches on 3D television.

Polarized glasses needed to watch Orange's 3-D demo

Orange is thus preparing the arrival of 3D television in its customers' homes. In collaboration with the French Tennis Federation and France Télévisions, this Orange premiere showcases this technological innovation which is sure to take sporting emotions to new heights.

Iconix-based 3-D rig

Note: 3-D capture is managed by NHK Media Technology (formerly known as NHK Technical Services), which provided three 3-D camera rigs and six engineers to Orange.

Source: Orange

Light Blue Optics Demonstrates Miniature Projection Systems and New Use-Cases at SID 08

Light Blue Optics (LBO) demonstrated its latest miniature projection systems at the Society for Information Display. These highly efficient, low-cost miniature projection systems are based on LBO’s proprietary holographic laser projection technology and have applications across multiple high-volume markets, including consumer electronics and automotive. The Company will make evaluation units available to key customers and strategic development partners from June 2008 as part of its fast-paced product development programme.

At SID 08, LBO demonstrated bright, efficient miniature projection systems with a range of differentiating features that can be tailored to suit specific customer applications. The systems deliver superior image quality with in-built speckle reduction, variable resolution (WVGA to QVGA) and focus-free operation, enabling multiple use-cases to be realised by the same device. Due to the unique properties of holographic laser projection, LBO’s systems also offer the option of enhanced image brightness for use in higher ambient lighting conditions.

Source: Light Blue Optics

XDC Signs Digital Cinema Deployment Agreements with Warner Bros, Paramount, Twentieth Century Fox and Disney

Warner Bros. Entertainment Inc. (“Warners”), Paramount Pictures Corporation (“Paramount”), Twentieth Century Fox Film Corporation (“Fox”) and The Walt Disney Studios (“Disney”) will support XDC, acting as a Deploying Entity, in order to roll out and fund digital systems for theatrical presentations in several countries in Europe.

Under the terms of the agreements, Warners, Paramount, Fox and Disney have independently agreed to co-finance the future deployment of up to 8,000 DCI-compliant digital cinema installations in 22 European countries. The roll-out period under the agreement shall last for a maximum of 5 years, while each digitized screen shall be co-financed over a period of up to 10 years. These agreements co-finance more than 65% of the value of digital exhibition systems - projectors, servers, applications and services - for a maximum estimated global investment of EUR 600 million.

These agreements mark and ease the beginning of the large-scale deployment of digital cinema in Europe. For XDC, the next steps are the negotiation of comparable agreements with European movie distributors, the sale of this co-financing proposal to cinema exhibitors across Europe and, finally, the raising of both equity and debt to fund the digital roll-out phase.

This infrastructure deployment will also help XDC to develop its other activities: first, the design and sale of cinema servers and software applications, secondly, the installation and maintenance of complete digital cinema systems for exhibitors, and thirdly, digital content processing and distribution services for movie distributors and advertising sales houses.

The agreements with Universal Pictures and Sony Pictures are at a very advanced stage and will be closed shortly. There is a global potential of 35,000 screens to digitize across Europe.

Source: FOX Business

ProjectionDesign to Launch its 2/3D Projector

ProjectionDesign announced that it will launch its 3D projector, supporting both 2D and 3D, in the Korean market in September.


The GP1 Active Stereo 3D projector needs to be used with active shutter glasses. The design is based on Digital IMAGE's Cube 3D, with the colorwheel running at twice the frame rate. The Cube 3D is modified with an extra F1+ electronics board to make a single-chip active stereo projector. Each revolution of the colorwheel is assigned to one eye. The result is a reduction in color resolution and a slightly distorted grayscale. This distortion is corrected through a look-up table.
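
The look-up-table correction is, generically, a 1-D remap: measure the distorted grayscale response, then for each input code choose the drive level whose measured output matches the intended response. The sketch below uses made-up response curves purely to illustrate the idea; projectiondesign's actual correction data is not public.

# Generic 1-D LUT grayscale correction with assumed response curves.
import numpy as np

levels = np.arange(256)
target = (levels / 255.0) ** 2.2      # intended display response (assumed gamma 2.2)
measured = (levels / 255.0) ** 1.8    # assumed distorted response of the device

# For each input code, pick the drive level whose measured output is closest
# to the target output; that mapping is the correction LUT.
lut = np.array([np.argmin(np.abs(measured - t)) for t in target])

corrected = measured[lut]             # applying the LUT restores ~gamma 2.2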


Even though the colorwheel spins at 2x, the two separate DLP formatters each control one revolution of the colorwheel, processing their separate left and right datastreams. The control of the DMD is switched 120 times per second to display a true full-resolution 3D stereo picture.



Source: AVING

Utah Scientific UTAH-400 iP

During the last few years, there has been a continual increase in the use of IT-based equipment in the broadcast industry. More and more broadcast devices for storage, manipulation and transport are now actually computer file-based systems. The industry transformation to what is essentially a hybrid broadcast/IT infrastructure has presented some challenges. Most notable is the issue of how to realize real-time switching on an IT network that isn't necessarily real time.

In simple terms, the signal-specific routing equipment that has been used in the broadcast industry basically connected a piece of equipment at point A directly to another piece of equipment at point B (and C to D, E to F, etc.) without any worry of interference from anything else going on in the system. Now, with the use of computer file-based equipment and Ethernet networks for the transport of digital video, files are transferred between devices using normal IP file transfer techniques, and all devices share the same pipe. Because of the extremely large size of these files, the time required for file transfers can be long and can be affected by other data traffic present on the network. The challenge is: How do you get guaranteed bandwidth on the network when you absolutely need it?
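
The scale of the problem is easy to put in numbers: even a dedicated gigabit link needs minutes to move a programme-length file, and every competing transfer on the same pipe stretches that further. The figures below are illustrative only, not taken from the article.

# Back-of-envelope transfer time on a shared link (illustrative figures).
file_gb = 50                 # assumed size of a programme file, in gigabytes
link_gbps = 1.0              # gigabit Ethernet link
efficiency = 0.9             # assumed protocol and framing overhead
competing_streams = 2        # transfers sharing the same pipe

seconds = (file_gb * 8) / (link_gbps * efficiency / competing_streams)
print(f"{seconds / 60:.1f} minutes")   # ~14.8 minutes, versus ~7.4 alone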

To solve the problem, Utah Scientific developed the UTAH-400 iP network switch, which allows for a managed Ethernet network with dynamic allocation of bandwidth, QoS and VLANs. The unit is designed to return real-time control of the switching fabric back to the broadcast engineer.

Utah Scientific UTAH-400 iP


The problem with networks
An IP network is often viewed as an inflexible, unmanageable configuration that results in performance that is fixed and has to be “lived with” as it is. Of course, there are methods to control the behavior of the network, but periodic disruptions, such as one user on the network consuming a large amount of bandwidth that results in loss of or impaired use for other users, are considered normal occurrences.

Built into the hardware and software of industry standard IP switch products are various methods to control the behavior of the network. These are used in varying degrees in different installations to do things like provide for priority of certain types of data traffic (video or voice, for example), isolate critical network segments from noncritical ones, and limit available bandwidth for noncritical functions. These methods include the IP type of service (TOS) or DiffServ features to allocate traffic priority, VLAN segmentation to isolate network segments, and ingress/egress queue management to control bandwidth.
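
The TOS/DiffServ mechanism is exposed by ordinary socket APIs, so marking a flow is a one-line option on the sending side. A minimal sketch (Linux-style constants; the DSCP value 46, "Expedited Forwarding", is a conventional choice for high-priority media, and the address and port are placeholders):

# Marking a flow with a DiffServ code point from the sender (a generic
# illustration of the TOS mechanism described above; works on Linux, where
# IP_TOS carries the DSCP in its upper six bits).
import socket

DSCP_EF = 46                 # "Expedited Forwarding" class
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.setsockopt(socket.IPPROTO_IP, socket.IP_TOS, DSCP_EF << 2)
sock.sendto(b"priority media payload", ("192.0.2.10", 5004))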

To date, these methods can be used at the system configuration level. In other words, once they are set up at the design and installation phase of the network, they are typically never altered, unless a portion of the network changes, and the video engineer in charge of day-to-day operation has little control of the system. To make changes, a system administrator would need to modify the setup parameters, and the system then continues on as before. It is a functional approach, but it is not dynamic. It relies on specialized IT personnel with extensive training to effect changes to the network. Some control parameters, such as IP TOS, can be made available, but they are not normally used because of the unwieldy nature of the management.

A new management method
The UTAH-400 iP network switch can control an IP network in real time, allowing for much more flexibility in the use of a new or existing IP network. Users do not have to live with limitations designed into their network. Traffic prioritization becomes something that users are allowed to control. Network segmentation can be changed for daily maintenance or backup functions on an automated or manual basis.

Perhaps most importantly, critical functions can be granted priority immediately, without intervention from an IT professional. And the administrator can decide how much control each user has, to assure the integrity of the network.

With the network switch, network management can be implemented in a variety of forms, all of which have been proven in real applications over the last 30 years. These range from industrial control panels to automation applications to in-band, transfer-by-transfer controls.

Making IT networks work for broadcast
IP TOS identifiers that manage traffic priorities in a network have been in place since the inception of the TCP/IP protocol. These methods work well for prioritizing one type of traffic over another, but not for determining priority between two or more streams of the same type of traffic. So, for example, if two editing workstations are simultaneously trying to move video files to a server for play-out on the evening news, there is no way to distinguish between the two and give priority to the clip that has to go to air in 60 seconds over the one that's not needed until after the first commercial break.

The UTAH-400 iP allows users to prioritize traffic coming from any device to any one of eight tiers that they desire, in real time. This approach works well for situations where multiple data streams are competing for a given port's bandwidth. Even when a network is designed with a large backbone bandwidth, connections to individual devices or between network segments will have a finite bandwidth that can be overrun if it is not managed. With dynamic TOS management, the data stream with priority can be guaranteed to arrive at the destination port no matter what other devices on the network are attempting to do, so the clip that needs to go to air in 60 seconds is given network priority with a simple press of a button. And the priority can be changed at a moment's notice based on user needs and the changing dynamic of the broadcast day, or remain in a predefined configuration.
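
The eight-tier behaviour can be pictured as strict-priority queueing: the switch always serves the highest non-empty tier first, so the clip needed on air in 60 seconds is never stuck behind a bulk transfer. The toy model below is a simplification of that behaviour, not the UTAH-400 iP's actual queueing logic.

# Toy strict-priority scheduler with eight tiers (simplified model only).
from collections import deque

queues = [deque() for _ in range(8)]     # tier 0 = highest priority

def enqueue(tier, packet):
    queues[tier].append(packet)

def dequeue():
    for q in queues:                     # always drain higher tiers first
        if q:
            return q.popleft()
    return None

enqueue(7, "archive file chunk")
enqueue(0, "clip needed on air in 60 seconds")
print(dequeue())                         # -> "clip needed on air in 60 seconds"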


Another management method available on the typical IT network is the control of VLANs. These are logically separated paths within the same physical network infrastructure. They allow users to segment a part of their network away from other parts. Dynamic control of VLANs with the UTAH-400 iP can allow users to segregate a network however they wish, and to change this as different needs arise. This management method is essentially the same concept as a traditional broadcast XY router in application.

Bandwidth, the physical capacity of any given network port, is typically represented as the speed of the interconnect, i.e., 10Mb/s, 100Mb/s or 1000Mb/s. While this accurately defines the maximum speed of the interconnect, the actual bandwidth can be modified by limiting the data capable of being passed over a specific port. Queues within the input and output sections of each Ethernet port on a switch can be dynamically bandwidth-limited at the discretion of a user. This allows a user to restrict data from a specific port that is involved in a noncritical activity, and then increase it to full bandwidth moments later when it is involved in a critical task. This brings another level of flexibility in the behavior of the network to the end user.
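
Per-port bandwidth limiting of this kind is commonly modelled as a token bucket: tokens accrue at the permitted rate, and a frame may leave only when enough tokens are available. A generic sketch of the model, not the switch's firmware:

# Minimal token-bucket rate limiter (generic model of per-port limiting).
import time

class TokenBucket:
    def __init__(self, rate_bytes_per_s, burst_bytes):
        self.rate = rate_bytes_per_s
        self.capacity = burst_bytes
        self.tokens = burst_bytes
        self.last = time.monotonic()

    def allow(self, frame_bytes):
        now = time.monotonic()
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= frame_bytes:
            self.tokens -= frame_bytes
            return True                  # frame may be sent now
        return False                     # over the limit: hold or drop it

bucket = TokenBucket(rate_bytes_per_s=12_500_000, burst_bytes=150_000)  # ~100 Mb/s
print(bucket.allow(1500))                # a 1500-byte frame passes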

Conclusion
Monitoring and managing IT networks through the use of the UTAH-400 iP network switch brings a new flexibility to Ethernet networks that makes them much more functional and adaptable to the dynamic reality of the broadcast environment. The switch allows for time-critical reconfiguration of any preplanned network without compromising the functionality that users are used to now. The result is an environment where users can identify and correct bottlenecks before they cause problems.

By Jeff Levie, BroadcastEngineering

Sony Pictures to Launch Digital Cinema Unit

Sony Pictures studio unveiled plans on Wednesday for a new digital cinema unit to bring filmed presentations of Broadway shows, rock concerts and sports events to specially equipped movie theaters nationwide. The new venture, dubbed the Hot Ticket, will launch in August with a presentation of the final staging of the music and dance extravaganza Delirium from Cirque du Soleil, which closed its worldwide tour in London in April. In September, the final performance in the 12-year Broadway run of the hit musical Rent will be presented.

"Our mandate will be to identify the one-of-a-kind, and sold-out events that people around the country most want to see ... and present them to audiences everywhere," Sony distribution president Rory Bruer said in a statement. Further out, he plans to explore numerous options. "I definitely think there is one project that we are looking at that would be 3-D," he said. "I presume at some point, live events." Tapping into other Sony units is also a possibility.

Hot Ticket presentations will be shown in high-definition format for limited engagements, starting out on roughly 400 to 500 screens in theaters across the country, with audiences paying roughly $20 a seat, Bruer said. The new Sony business is capitalizing on a sweeping upgrade of movie houses being undertaken by major theater chains and studios to bring state-of-the-art digital projection technology to thousands more screens in the United States and Canada.

Besides lowering distribution costs for studios, digital technology is seen as paving the way for the introduction of more "alternative content" to theaters, helping exhibitors bolster sagging movie admissions, especially on weekdays. Sony Corp is hardly alone in bringing such non-movie entertainment to a theater near you, but it is believed to be the first major studio to create a separate unit devoted to such content.

"We're excited to be on the ground of floor of what is going to be a new business for movie theaters," Bruer told Reuters.

About 5,000 of nearly 39,000 U.S. cinema screens are already digitally equipped, and that number is expected to climb steadily, said John Fithian, president of the National Association of Theatre Owners. He called Sony's move a "confirmation that everyone in the movie industry envisions the cinema as a growing entertainment destination for a variety of products."

"It's significant that Sony recognizes the potential for alternative content in cinemas by creating a separate unit," he said.

The Sony initiative builds on a trend that has been evolving in the movie industry for some time. Concert films have long been popular offerings at the multiplex, and the Walt Disney Co. scored a box-office bonanza with its recent 3-D release Hannah Montana & Miley Cyrus: Best of Both Worlds Concert.

Landmark Theatres screened opera star Placido Domingo's 40th anniversary concert in 22 playhouses last month, and several theater chains have teamed up to show auto racing, soccer matches and even Tour de France competition. National Amusements, the controlling shareholder in Viacom Inc., has been screening high-def broadcasts of select Boston Red Sox baseball games in its Showcase Cinemas in New England since 2003.

By Steve Gorman, Reuters

Digitalfie, Digital Cinema Consortium, Plans for Digital Cinema Services

Digitalfie, a digital cinema industry consortium, has announced plans to begin testing a series of new digital cinema services that the group says can help exhibitors increase sales by as much as a fourth.

The pilot project will be hosted by U.K. digital exhibitor Kino Cinemas, one of the members of the consortium, but it is open to other European exhibitors as well.

Other Digitalfie members include digital server manufacturer Doremi, Austrian software developer SiTec and German-based consultancy Peacefulfish.

Source: Digital Cinema Buyers Guide

MainConcept Technology Powers HD Video Editing Platform for Adobe Premiere Pro

MainConcept, a wholly owned subsidiary of DivX, and one of the world's leading providers of video and audio codecs and software development kits (SDKs) to the broadcast, film and consumer markets, today announced the release of the updated MainConcept MPEG Pro HD 3.2 Plug-In for Adobe Premiere Pro CS3, providing enhanced support for the editing of high-definition video.

MainConcept's renowned MPEG Pro HD 3 Plug-In allows the quick and easy editing of MPEG, H.264/AVC, and even native HDV content in the familiar Adobe Premiere Pro workflow without requiring transcoding. The newest MPEG Pro HD 3.2 Plug-In brings frame-accurate, native MPEG editing with smart rendering to Premiere Pro CS3 versions. It includes full support for Adobe Premiere Pro CS3, Dolby Digital, AVC-HD and a wide variety of camcorders, including Sony's new XDCAM HD 4:2:2 series optical-disc and XDCAM EX series solid-state camcorders.

The newest release of the MPEG Pro HD 3.2 Plug-In supports Sony XDCAM EX, XDCAM HD 4:2:2, and 4:2:0 series camcorders. It also adds professional features such as H.264/AVC import and export, enhanced HD editing performance, significant speed improvements for MPEG-2 encoding and H.264/AVC decoding, and improved integration with Adobe's flagship video editing application.

Source: Mainconcept

New JPEG 2000 Conversion Software for Thomson Grass Valley Infinity Camera

CineForm has developed a new plug-in for a real-time post-production workflow for the Thomson Grass Valley Infinity Digital Media Camcorder that preserves the 10-bit precision of the camera’s recorded JPEG2000 files.

CineForm’s new software, called the JPEG2000 Play Module for Infinity, re-wraps MXF files generated by the Infinity camcorder to AVI or QuickTime files for preview playback on both Windows and Mac workstations. When combined with CineForm’s Neo HD or Prospect HD software, Infinity files can be converted in real-time to 10-bit CineForm Intermediate files, which are cross-platform compatible with many post-production tools, including Adobe Premiere Pro, Apple Final Cut Pro, and others.

The new JPEG2000 Play Module for Infinity can be used in production for shot review and in post for converting MXF-wrapped Infinity files into CineForm Intermediate AVI or MOV files. For shot review, Infinity files are copied from recordable media (Rev Pro, Compact Flash, or other) and are simultaneously re-wrapped into AVI or MOV files that maintain the J2K essence underneath. Double-clicking on the re-wrapped media will launch the default media player, such as Windows Media Player or QuickTime Player, allowing immediate review of the recorded files.
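
The re-wrap step itself (changing the container while leaving the JPEG2000 essence untouched) can be illustrated generically with a stream copy. The command below is not CineForm's Play Module, the file names are placeholders, and whether a particular MXF/JPEG 2000 file survives the copy into a given container depends on the tool build; it is only meant to show what "re-wrapping without transcoding" means.

# Generic container re-wrap without transcoding (illustration only; the
# file names are placeholders and this is not CineForm's module).
import subprocess

subprocess.run(
    ["ffmpeg", "-i", "clip_from_infinity.mxf", "-c", "copy", "clip_preview.mov"],
    check=True,
)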

In post, Infinity files can be converted on the fly into 10-bit CineForm Intermediate files with either an AVI or MOV wrapper when CineForm’s Neo HD or Prospect HD software is also installed. Converted files are immediately available for editing within most industry-standard post-production tools. CineForm conversion algorithms maintain the 10-bit 1080i and 720p source formats recorded by Infinity.

Licenses for Windows and Mac versions of CineForm’s JPEG2000 Play Module for Infinity will cost $99 per seat. The Windows version will be available for purchase in May from CineForm’s website, and the Mac version will be available a few weeks afterward, according to the company.

By Michael Grotticelli, StudioDaily

DivX Launches Beta Phase of its Own H.264 Decoder

10 years after DivX revolutionized video on the Internet and seven years after the launch of Project Mayo ("OpenDivX"), which set the stage for DivX 4 and Xvid, DivX Networks is again mixing a new dip. The new project, called Rémoulade by DivX Labs, is developing an H.264 video decoder based on an implementation from MainConcept. DivX took over MainConcept, the German MPEG specialist, last November.

The DivX developers have announced the launch of their decoder's beta phase. Those who wish to take part in the beta test have to own, or create, a DivX account and then send a private message to DivX. To simplify the procedure, the beta version will soon be made available via e-mail.

The decoder is expected to support all common H.264 profiles – Main, High, High 10 and High 4:2:2 – in addition to interlacing methods (MBAFF, PAFF, and mixed) and multithreading – up to 8 CPUs. DivX says the code was optimized for MMX, SSE and SSE2. Further details are available at the developer's website. A speed comparison with the universal ffdshow tryouts audio/video decoder and the commercial CoreAVC – considered the fastest and most reliable H.264 decoder for Windows at the moment – is also provided. The stats for DivX H.264 Decoder Beta 1 show that this new decoder is almost 2 per cent faster and scales better than CoreAVC. The test clips used were encoded with x264, but an H.264 encoder is expected to be released under the DivX label, probably also based on MainConcept.

DivX grew up with MPEG-4 Part 2 and is one of the main reasons why most DVD players can play back MPEG-4 videos. Fortunately for DivX, the Moving Picture Experts Group integrated H.264, a more efficient video compression method, as Part 10 (Advanced Video Coding) of the MPEG-4 video standard, so now the company will be attempting to get the DivX label used synonymously with MPEG-4 – regardless of whether people are talking about Part 2 or Part 10.

Source: Heise Media

TDVision Systems to Showcase Optimum HD-3D Stereoscopic Experience at SID 2008 Show

TDVision Systems will present at SID 2008 (Society for Information Display) the convergence of stereoscopic 3D Samsung displays, the immersive TDVisor and the complete suite of TDVision applications, including the stereoscopic virtual world AlterSpace, Dejaview, the 3D media player and a compelling demonstration of the 2D- and MPEG-compatible TDVCodec running on a Blu-ray disc.

The TDVCodec demonstration will show HD-3D compatibility with existing 2D infrastructures: the same Blu-ray disc can be read by any existing Blu-ray player as a 2D video stream in full high definition, without any loss in quality, color, frame rate or resolution. When using an updated TDVReady decoder, the user will enjoy the best 3D experience at home, removing the need for a separate 3D release for Blu-ray and broadcast.

"TDVision has enabled a media breakthrough with our AlterSpace product by enabling the world's first and only stereoscopic virtual world featuring media sharing and viewing. Compatible with multiple 3D display devices, this will be the industry standard that others go by. We are proud to be a stereoscopic technology steamroller, and we are working hard to blur the boundaries between 3D cinema and gaming. Alterspace, our 3D working environment and interface is so realistic you won't believe your eyes," said Ethan Schur, Director of Product Marketing for TDVision Systems.

Attendees at SID 2008 will experience the possibilities of navigating the AlterSpace 3D environment, now enabled for Samsung 3D ready DLP, and even watch 3D videos inside this customizable AlterSpace environment. TDVision has erased the line between reality, pre-recorded images, and computer generated content.

"The TDVisor exceeded my expectations for an affordable head-worn binocular 3D display. First, the image quality has a higher pixel count and wider FOV, but more important are the separate right and left video input channels. A number of other head-worn displays have a single video input, requiring two live camera outputs that must be multiplexed before being fed into the display's single input port and then again de-multiplexed by the display itself. The TDVisor keeps it clean and simple. Two channels of input are displayed on the two near-to-eye displays," said John Merritt, expert of stereoscopic displays and applications and Chief Technology Officer of The Merritt Group. "Combined with the versatile CODECs developed by TDVision, this is an unbeatable combination, making it simple to integrate into a wide variety of stereoscopic imaging systems. What's more, when combined with a fast head-tracker for intuitive interactivity, the TDVisor will provide a significantly heightened sense of immersion for gamers and tele-robotic systems operators," he continued.

TDVision gives the user the freedom to view high definition in 2D, in 3D on a Texas Instruments DLP-based television like those made by Samsung and Mitsubishi, or in a portable and immersive fashion using the TDVisor, a head-mounted display.

Source: Forbes

James Cameron: 3D Heading Beyond Movies

"Filmmaker James Cameron sees the world in stereo. So does everyone else, though, and that's exactly his point.

"When you are viewing in stereo, which is what we do," Cameron said, "more neurons are firing. More blood is pumping through the brain."

Cameron has been a big proponent of making movies in 3D, but he said that the digital projectors going into movie theaters are capable of showing more than just movies. Cameron's talk came as part of Microsoft's Advance 08 advertising conference, which runs through Wednesday.

"That digital image can be live," Cameron said. "That digital image can be 3D."

James Cameron speaking at Advance 08 advertising conference


He suggested such locations can show live sports and events, alongside impressive travelogues and other content.

"We're not quite there but we are on the cusp of that and people need to have a strategy for it," he said.

More than 1,000 theaters in the U.S. already have stereoscopic (3D-capable) projectors, while Cameron hopes that there will be 5,000 such facilities by the time his 3D movie Avatar debuts next year.

3D movies have often generated much more revenue than 2D versions of the same film, a potential boon to the entertainment industry. Retrofitting theaters with 3D technology is expensive and difficult, though, and some 3D advocates are unhappy with the pace of adoption.

"I feel as though things have dragged along, and it's been pretty disappointing," DreamWorks Animation SKG Chief Executive Jeffrey Katzenberg said in April, according to Reuters.

3D isn't just for theaters. The real revolution, Cameron said, comes as games and television also start appearing in three dimensions.

"Stereo production is the next big thing," he said. "We are born seeing in three dimensions. Most animals have two eyes and not one. There is a reason I think."

He noted that games, in particular, stand to benefit. First-person shooters become true first-person experiences, he said.

"You are in the game," he said. "This is the ultimate immersive media."

He noted that Ubisoft, which is making the game version of Avatar, already has a stereoscopic game up and running using a standard Xbox 360 and 3D glasses.

Cameron said that displays for laptops, phones, and Zunes can be made stereoscopic even without needing special glasses. The Windows operating system, Cameron said, should be viewable in 3D.

"They should be talking to their various partners," Cameron said.

Earlier in the day at the conference, Microsoft announced a new "Microsoft Advertising" brand to try to unify its disparate tools for advertisers and publishers as well as an effort to start selling display advertising on mobile phones."

By Ina Fried and Stephen Shankland, CNET News

Considering the Future of 3D

"Bruce K. Long is a 3D enthusiast. Earlier this year he was appointed CEO of Iconix Video, a camera manufacturer and 3D integrated service provider that develops point of view (POV) cameras such as the new HD-RH1F camera, the Studio2K and remote control units that represent an assortment of 3D solutions. True to his roots as a filmmaker with a background in production and post, he is poised to bring 3D technology to mainstream films in his new role.

Long’s journey to Iconix included a stint as president and COO of National Lampoon, where he oversaw production, distribution and network operations. Prior to that, he served as executive vice president of strategic planning and business development at Technicolor Creative Services.

Do you see 3D technology replacing the way we watch movies now?
I’m one of those guys who has been wild-eyed since he first saw 3D and the potential for the 3D viewing experience. It’s possible that soon, when home video begins to embrace stereoscopic playback, all motion pictures are going to transition to 3D. That doesn’t mean that I want all movies to be like an amusement park experience, where they’re throwing a spear through the camera, but I believe that everything from A Room with a View to Animal House to Beowulf deserves to be experienced in 3D.

There’s a whole genre of independent movies and traditional non-amusement park-style experiences that would be made even more powerful by the immersive elements of 3D. I’d love to see all movies go 3D eventually.

How does your company bridge the gap to make 3D accessible for filmmakers?
We’re trying to deliver rigs and 3D camera setups that are less expensive. The 2K Iconix camera costs around $16,000, and it’s competing in the marketplace with cameras 10 times that price. We’re trying to allow the rental companies to deploy with a lower cost of entry and, therefore, facilitate production overall.

The second thing we’re doing is really practical. We are getting filmmakers comfortable on set with 3D. We’re working closely with great 3D companies like 3ality and Paradise. We provide the ITs who are trained to make sure our gear fits with their needs very carefully and very specifically.

I’m excited by 3D and there’s the buzz around 3D, but the business model still needs to be figured out. We take that kind of responsibility seriously.

How do you go from being a camera company to an integrated 3D service provider?
We have a tremendous beachhead in the Iconix camera. But the industry did not have an overall, end-to-end solution for stereoscopic 3D. We set out in October to expand the company from the cameras to the integrated solution approach. That process has involved forming strategic partnerships with various companies.

At NAB, Iconix showed an end-to-end camera-to-postproduction stereo 3D solution. The initiative involved making sure our products could do stereo storage, stereo playback, stereo conform, color and title. In July, we’re offering that solution in 2K. It’s a big step forward for us.

What are the challenges for 3D theatrical release?
The challenges today for stereoscopic are both in distribution and production. We’re dealing with the rollout of stereoscopic infrastructure in the theaters and in home video.

The first challenge is establishing theaters with digital cinema capabilities and, therefore, stereoscopic playback capabilities. That seems to be evolving pretty quickly, but it's one of the big hurdles still.

The second hurdle for us is helping Hollywood transition to 3D production. Our goal is to help filmmakers produce 3D for 10 or 15 percent more than they normally would in 2D production, as opposed to 30 or 40 percent more, which is where the market appears to be right now.

As a filmmaker myself and someone who understands the challenges that happen on set, the third hurdle is to help mainstream filmmakers transition to 3D stereoscopic production and the workflow involved there. We are mapping out a workflow for stereoscopic 3D that is not so different from 2D that it becomes cumbersome. That means supporting folks on set who are doing their first 3D movie."

By Joy Zaccaria, Videography

Regal Entertainment Group and Real D Partner to Expand Regal’s Premium 3D Platform

Regal Entertainment Group, the owner and operator of the largest movie theater circuit in the world, and RealD 3D, the global leader in 3D, today announced an unprecedented partnership. The deal calls for a rollout of 1,500 RealD 3D screens, bringing the RealD 3D screen count to over 3,500. The rollout will allow most U.S. markets to have 3D capability and will commence upon the completion of the Digital Cinema Implementation Partners (DCIP) initiative. Regal and RealD will work together to market and develop the RealD premium 3D platform in key Regal markets and theatres throughout the U.S.

The consummation of the announced deal is subject to completion of definitive agreements and is contingent upon DCIP finalizing the necessary digital cinema conversion arrangements.

Source: DCinemaToday

Philips Demonstrates "3D TV" Without Glasses

Even before we get used to high-definition TV, researchers are planning to place "3D" TV in our homes but without the funny glasses.

Philips Electronics NV gave a peek into its research pipeline Tuesday, demonstrating a prototype that was still fuzzy around the edges. Operating like a holographic greeting card, it combines slightly different angles of the same image to create video that appears to have different depths as your eyes scan it. The result is uneven, at some moments blurry, and at others merely two dimensional. But sometimes the apparent depth or protrusion can be startling.

"We say the market progression is black and white, to color, to high definition, to 3D," said Bjorn Teuwsen, demonstrating the product. "We estimate in a few years these will be in homes."

Specialized models have been sold since 2006 to corporations - mostly movie theaters and casinos - where they are usually used for advertising signs. But Philips said the product is not yet ready for consumer rollout.

Samsung is demonstrating its own no-glasses 3D television concept model this week in Las Vegas.

Source: The Sydney Morning Herald

Final Destination 4: Set Visit

"Final Destination 4 is being shot in HD 3-D using the PACE camera system, James Cameron's technique of choice on the upcoming Avatar. And FD4 is actually the first movie filmed on practical locations to use the technology. Proponents of next generation 3-D, most notably Cameron, have been saying for a while now that the technology will soon shed it's gimmicky perceptions and may even cross the genre barrier into dramas and the like.

Believe it or not, Final Destination 4 looks like it could be the beginning of that. Yes, it has plenty of shocking deaths that make use of the 3-D presentation in just the way you'd expect (a severed head comes flying at the camera at one point), but in the footage shown to us by Perry the more mundane moments are just as impressive for their immersiveness. There's an establishing shot of a mall interior, for instance, that really pulled us into the scene. And simple things like watching the characters talking... the 3-D really makes you feel like you're there with them. Then there are the scares, like an ingenious car wash sequence featuring actress Haley Webb where the machinery malfunctions causing her Scion to get stuck on the track... "and bad things happen," as producer Craig Perry quipped frequently during our time on set.

Working in the 3-D medium has been a thrill for director David Ellis. And it's not just the action scenes that make it exciting. "The coolest thing about the 3-D is the world. I think the dialogue scenes are really cool because you feel like you're there in the moment."

Final Destination 4 doesn't have a release date yet, but we expect it to hit theaters in early '09. That gives theatrical exhibitors that are working steadily to add more 3-D screens to their movie houses some time. It's an experience you won't want to miss."

By Brian Linder, IGN

AVS in the Picture as Global Standard

Zhang Xiaoqiang, bureau chief assistant of the Shanghai Division of Institute of Computing Technology (SHICT) under the Chinese Academy of Sciences, is adamant about the benefits of China's development and industrialization of its homegrown audio video coding and decoding standard, AVS (Audio Video coding Standard). For him, throwing off the shackles of having to pay huge royalties for using foreign-developed standards is a huge shot in the arm for any technology sector.

The audio video coding and decoding standard is used in all audio video applications such as mobile TV, digital terrestrial TV and video surveillance. The video signal, before being transmitted, is coded or compressed at the source so that it can be transmitted at a fast speed. When the signal reaches the receiver, it needs to be decoded by a player program so that the user can see the video.

"In the past, standards developed by foreign countries, such as MPEG-2, MPEG-4 and H.264, monopolized the market. China had to pay very high royalties for audio video coding and decoding technologies used in DVD players, digital TV equipment and even the delivery of digital TV programs," Zhang said.

"The royalty for each DVD player is as high as $20, even though the gross profit is maybe only $24. Foreign patent owners normally do not claim the royalties until the market reaches a certain scale. They then will claim royalties, including the royalties that are incurred in the past," Zhang said.

China's audio video industry is expected to reach a scale of RMB 150 billion ($21.4 billion) in 2010, and the industry needs a homegrown audio video coding and decoding standard for faster growth, Zhang said.

AVS the key to unlocking industry's potential
The SHICT is a non-profit organization founded by the Institute of Computing Technology under the Chinese Academy of Sciences and the Pudong New Area government in Shanghai. It promotes the development and industrialization of homegrown technologies and fosters start-up enterprises in the Pudong New Area.

The SHICT has been actively involved in the development and industrialization of AVS. It is a founding member of the AVS Industry Alliance, an organization that promotes the industrialization of AVS. It owns two patents related to AVS technologies and the copyright for six pieces of AVS-related software. In September of last year, the SHICT launched the world's first Internet Protocol (IP) camera based on AVS and put it into mass production in December. The camera is branded Longcam.

In 2002, the decision to develop a homegrown audio video coding and decoding standard was made at the Xiangshan Science Conference. Afterwards, three organizations were established to promote the development and industrialization of AVS, namely the AVS Workgroup, the AVS Patent Pool Executive Council and the AVS Industry Alliance.

China owns 90 percent of intellectual property rights in AVS, and the Institute of Computing Technology owns approximately 50 percent of that portion. There are now more than 170 members in the AVS Workgroup. Members of the AVS Workgroup pay only a royalty of RMB 1 ($0.14) for each device that uses the AVS standard. China's Ministry of Public Security has named AVS an industrial standard for security video surveillance.

Industrialization of AVS
"AVS is mainly used in six types of applications, namely IPTV, digital terrestrial TV, mobile multimedia applications, direct-to-home broadcasting, high-definition video storage and video surveillance," Zhang said. AVS-based products include chips, software and devices such as set-top boxes, audio video coders and servers, he said.

In September 2005, the SHICT founded the Mobile Video & Audio Industry Alliance in Shanghai to promote the industrialization of AVS in mobile multimedia applications. It successfully developed an AVS-M coder in 2005 and an AVS-M decoder intellectual property core in 2006. AVS-M is the AVS video coding standard targeting mobile multimedia applications.

Shanghai Longjing Microelectronics Co. Ltd., Spreadtrum, Broadcom and ST have all launched AVS-based audio video decoder chips. Beijing-based USC, SVA and Envivio are providers of AVS-based audio video coders. More than 10 manufacturers, including Changhong, Hisense, ZTE, UTStarcom, SVA, Huawei and TCL, provide terminals that support AVS.

"MPEG-2 is already widely used in China's broadcasting industry and the country has to pay large amounts of royalties for products that use the MPEG-2 standard. SVA has developed an MPEG-2 to AVS signal converter that can help to avoid royalties," Zhang said.

Last year, digital terrestrial TV trials using AVS as the audio video coding and decoding standard and DMB-TH as the transmission standard were launched in Hangzhou, Chengdu, Shanghai and Baoding in Hebei Province. DMB-TH, also known as GB20600-2006, is China's national digital terrestrial TV standard. In April of this year, the country's first IPTV services utilizing AVS were put into commercial use in Dalian.

Going global
In September 2006, the AVS Workgroup established a relationship with the ITU Telecommunication Standardization Sector (ITU-T), a step toward AVS becoming an internationally accepted audio video coding and decoding standard, Zhang said.

"AVS, the ITU H.264 and Microsoft's VC-1 are now the three major audio video coding and decoding standards in the world." Zhang said.

In February 2006, the video portion of AVS was approved as China's national video coding and decoding standard. In December of last year, the audio portion of AVS was published and submitted to the former Ministry of Information Industry.

"AVS will soon become a national audio coding and decoding standard," Zhang said.

The Longcam's longview
In November of last year, the SHICT's Longcam was presented the 2007 China Audio Video Industry Innovative Application award at the Third China Audio Video Industry Technology and Application Forum.

"The IP cameras on the market are normally priced at over RMB 3,000 ($428.6), while ours is RMB 1,000 ($142.9) to RMB 2000 ($285.7). We will try to lower the price to approximately RMB 500 ($71.4) to RMB 600 ($85.7) so that general consumers can afford them. Now it is possible to lower the price of IP cameras thanks to the low royalties of AVS," Zhang said.

"Once the price is reduced, IP cameras can not only be used in public security surveillance, but also by general consumers. For example, people can monitor their babies or pets on the camera. China Telecom already has a service that allows parents to watch their babies. It is our vision that people will be able to monitor their homes on their mobile phones using our solution in the future," Zhang said.

On the other hand, the government is actively promoting homegrown technologies and AVS will eventually replace H.264 and MPEG-4 in China, he said.

"The potential of the AVS standard market is huge. China's Ministry of Public Security has named AVS an industrial standard for security video surveillance. It has requested that all video surveillance records in Internet cafes must be in the AVS format," Zhang said.

The SHICT recently received contracts to supply approximately 30 Longcam IP cameras to the SME (Small and Medium-sized Enterprise) Business Center in Zhangjiang Hi-Tech Zone and about 10 cameras for a science education project organized by the Pudong New Area Science Association.

The institute is also in talks to supply the Longcam to China Netcom's subsidiary in the Inner Mongolia Autonomous Region for use in video monitoring services.

"We are developing a human face recognition application of the camera together with Shanghai-based Isvision Technologies. Such a solution can be used in intelligent surveillance at the site of the 2010 Shanghai World Expo," Zhang said.

The SHICT is currently looking for funding from venture capital firms to commercialize its Longcam product.

"Only when we establish a company can we commercialize this product. The company will be jointly owned by the SHICT, the team members and the venture capital firms," he said.

Source: Interfax China

Samsung Electronics Offers a Glimpse of the Future for TV Displays at SID 2008

Samsung will exhibit a multi-view digital information display (DID) that delivers 3D images without requiring special glasses. The company is confident that this display can establish a new niche market apart from other DIDs previously introduced.

Source: BusinessWire

Dawn of the Dead Goes 3-D

"George A. Romero's Dawn of the Dead will be "dimensionalized" to stereoscopic 3-D for a planned theatrical release.

New Amsterdam Entertainment has tapped 3-D company In-Three, which will use its proprietary "dimensionalization" process to turn the 1978 indie horror flick into 3-D. The project is expected to be completed within the year.

So far the only legacy 2-D film that has been converted and re-released in digital 3-D is Tim Burton's The Nightmare Before Christmas, which Disney released in October 2006 in 168 theaters, grossing $8.7 million. Disney reissued the film in October 2007 and plans to repeat the release this year and in 2009.

There are slightly more than 1,000 3-D-ready digital screens in the domestic market, and that number is expected to grow.

In-Three uses patented software tools and techniques to create a second camera image from a 2-D image. Each frame is "dimensionalized," meaning that all objects are moved forward or backward from the screen or in relation to one another so as to achieve the desired dramatic effect.
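In-Three's "dimensionalization" process is proprietary, but the basic idea described above, assigning each object a depth and shifting it horizontally to synthesize a second viewpoint, can be sketched in a few lines. The snippet below is a deliberately crude illustration of depth-based view synthesis, not In-Three's method; the function name and shift range are invented for the example, and a real pipeline would also fill the occlusion holes this naive version leaves behind.

```python
import numpy as np

def synthesize_second_eye(image, depth, max_shift_px=12):
    """Crude second-eye synthesis: shift each pixel horizontally in proportion
    to an artist-assigned depth value (0.0 = far, 1.0 = near the viewer).
    image: (H, W, 3) uint8 array; depth: (H, W) float array in [0, 1]."""
    h, w = depth.shape
    second_eye = np.zeros_like(image)
    shifts = np.round(depth * max_shift_px).astype(int)   # per-pixel disparity in pixels
    for y in range(h):
        for x in range(w):
            nx = x + shifts[y, x]             # nearer objects receive larger shifts
            if 0 <= nx < w:
                second_eye[y, nx] = image[y, x]
    return second_eye                         # occlusion holes are left black here
```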

In-Three "dimensionalized" the Star Wars 3-D demo clip that first screened at ShoWest in 2005.

Conversion of legacy material using the process starts at about $50,000 per minute and can reach more than $100,000, depending on the complexity of the imagery in terms of visual effects and other elements. At those rates, converting a typical feature of 90 to 120 minutes would run from roughly $4.5 million to well over $10 million.

"We are seeing interest now that people realize there will be sufficient screens to justify the cost," In-Three's Neil Feldman said."

By Carolyn Giardina, The Hollywood Reporter

Steven Spielberg about Digital Cinema

Steven Spielberg has not walked the red carpet since The Color Purple was selected out of Competition 22 years ago. He is back on the Croisette with the long-awaited Indiana Jones and the Kingdom of the Crystal Skull.

"The film is being released digitally on a lot of screens, about 300. Making a film digitally and releasing a film in the same digital process gives a beautiful image. It creates an extraordinarily clean, sharp image, but making a film on celluloid - as I’d like to do with all of my pictures - then transferring, releasing it, and projecting it digitally is a very inferior image. So the decision to go out to a vast number of motion picture theatres was a simple decision for me to make. But digital cinema is inevitable, it’s right around the corner and even someday I will have to convert, but right now I love film.”

Source: Festival de Cannes

Photonica

Having established world firsts in its drive to build the most advanced display-projector technology on the planet, a 4K digital cinema projector, the Photonica team's new total media system architecture strategy carries forward the next generation of innovation in its core hardware visual display programs.

It is this combination of Photonica's next-generation core hardware display programs, including fiberoptics and magneto-photonics for 4K+ digital cinema projection and for consumer flexible, dimensional and flatscreen displays, with Photonica's new media system integration IP, designed to ensure the success of each key component, that makes the NeoSpace Systems solution the only one available today capable of bringing about the world's next revolution in visual entertainment.

Category Defining Digital Projector Products:
- Projectors with Photonica light engines will deliver industry-leading brightness, power efficiency and pixel density, as well as unprecedented switching speed.

- This significant improvement in switching speed will enhance the delivery of anti-piracy protection and greatly improve 3D capability for the entertainment industry.

Breakthrough Display Technology:
- The Company’s unique magneto-photonic technology delivers both a significantly better viewing experience and a more durable, stable and cost-efficient light engine.

- Most importantly, the textile-matte fiberoptic technology enables large and small flexible woven displays, ranging from portable “rollable” woven screens to flat panels that are larger, lighter and cheaper than plasma or LCD, as well as view-through “wearable” displays and dimensional and holographic displays.

Disruptive Market Pricing:
- The low manufacturing costs on all Photonica products will enable product and system manufacturers to generate attractive profit margins.

- Projectors based on the Company’s component hardware technologies will reduce the initial investment and ongoing operational costs for theatre owners as they convert from film projection to digital projection.

- As the hardware program yields Photonica’s own first-generation 4K technology for digital cinema, it will be adopted as an upgrade path by exhibitors who have already implemented NeoSpace 1.0 solutions for their digital cinema conversion needs.

SafeNet Acquires Beep Science

"Oslo-based DRM vendor Beep Science has been acquired by US security technology company SafeNet. The deal, terms of which were not disclosed, was signed a couple of months ago (without publicity) and closed last week. Beep Science's team will join SafeNet offices in Helsinki, Amsterdam, and other European locations. Beep Science's client and server technology for OMA DRM will complement SafeNet's existing DRM server software business, which it acquired from DMDSecure in 2005.

SafeNet has made a string of acquisitions in recent years, including antipiracy service provider MediaSentry, also in 2005. The company started in the 1980s as an enterprise security technology vendor and has essentially become a roll-up.

Its acquisition of Beep Science makes sense from the standpoint of filling gaps in its product line. SafeNet has DRM server software (DRM Fusion) that supports multiple DRMs, including OMA DRM as well as Microsoft Windows Media DRM, and it has considerable expertise in core security technology for consumer device hardware. It counts Samsung, TI, and AMD as customers for the latter. With Beep Science, SafeNet will get both DRM client software and expertise in integrating DRM capabilities with consumer devices. It is now able to sell end-to-end DRM technologies to consumer electronics companies.

The risk in this deal is that Beep Science is exclusively an OMA DRM technology company, so its success is predicated on the future of OMA DRM, particularly OMA DRM 2.0. Although OMA DRM 1.0 has a very large installed base in mobile handsets worldwide, the number of services that have actually deployed OMA DRM 2.0 today is very small. SafeNet expects growth in OMA DRM 2.0 adoption to come in the mobile gaming, mobile broadcast, and e-book spaces, all of which will take some time to develop. The small, monolithic Beep Science could not afford to wait around for OMA DRM 2.0 to take hold in environments like the OMA BCast DRM Profile for DVB-H, but a larger and more diverse company like SafeNet can afford to look to a longer time horizon for its revenue.

Still, mobile DRM is an uncertain and fragmented market, even where it is eventually adopted; several technologies are in play, including Microsoft's PlayReady, Marlin, SDC, and potentially Apple's FairPlay in addition to OMA DRM 2.0. It remains to be seen whether SafeNet can successfully integrate Beep Science's technologies with the ones it already has and translate that into opportunities that meet the needs of this ever-changing market."

Source: DRM Watch

James Cameron and Vince Pace Speak at Prestigious Cannes Film Festival

Vince Pace, one of the thought leaders in 3D entertainment, joined legendary filmmaker and partner James Cameron yesterday to participate in The American Pavilion's "3D Day" during this year's Cannes Film Festival. The two innovators of the PACE/Cameron Fusion system and world-renowned leaders in 3D production participated in the panel discussion via a Skype video call from Los Angeles, where James Cameron continues production on the highly anticipated 3D feature film "Avatar" with Pace serving as Director of Photography.

The 3D Day included two topic panels that were combined into one long panel discussion, "New Dimensions for 3D: How Digital 3D Will Shape Movie Production and Distribution During The Next 20 yrs" and "New Technology Driving Digital 3D."

Vince Pace and James Cameron

The panel proved invaluable to the industry, as long-standing misperceptions about 3D production were lifted: legendary and well-respected director James Cameron, along with cinematographer and PACE CEO Vince Pace, laid out the true realities of the technology, its integration, and the financial impact of stereoscopic filmmaking. With most industry speakers touting a 20-40% cost increase, Pace stepped up to the microphone and brought to the table Final Destination 4, which is currently in production using the latest PACE/Cameron Fusion F23 System. Budgeted with only a 10-15% cushion, the production is currently ahead of schedule and under budget.

Pace and Cameron went on to clear the air as to the number of acquisition systems currently on the market and available for immediate use in production. The current Fusion System line, the world's largest inventory of 3D systems, could easily service multiple high-level feature films simultaneously. Both veterans stated that it is a matter of demand driving the number of systems in the market, not the ability to actually produce them, and challenged filmmakers to join the revolution. PACE has the immediate ability to produce additional systems thanks to the internalized infrastructure of its proprietary technology.

Source: PR Newswire

Clix 3D IPTV

"Clix, the triple play brand of Portuguese telecommunication operator Sonaecom, has launched a new VOD offering on its SmarTV service which offers movies and documentaries in 3D."

Note: This 3-D offer seems to be based on the Pulfrich effect, in which a darkened filter over one eye delays that eye's perceived image slightly, so that horizontal on-screen motion is read by the brain as depth.

Source: Advanced Television

Insight Media Releases 2008 3D Television Report

Insight Media is pleased to announce the release of the 2008 3D Television Report. This 265-page report, written by Insight Media senior analysts Matthew Brennesholtz, Art Berman, Chris Chinnock, Michael Kalmanash, Dale Maunu and Bernard Mendiburu, is priced at $3,995.

Historically, the 3DTV market has been very small, with solutions limited to add-ons that only worked with CRT displays. Advances in display technology and 3D video formats, plus additional 3D content from the cinema, games and other sources, mean this is likely to change in the coming years. This report is designed to provide a solid technical and market analysis of the 3DTV options and to forecast the penetration of this technology into the worldwide television market.

Main sections of the report include:
- Executive Summary
- Human Factors of 3D Stereo Perception
- Existing Content for 3DTV
- New 3D Content Creation
- 2DTV Display Technologies
- 3DTV Display Technologies
     - 3DTV Projection Systems (Front and rear projection)
     - 3D AMLCD Television
     - 3D Plasma Television
     - 3D OLED Television
- 3DTV Content Delivery
- Forecast and Prospects for 3DTV
- Glossary of 3D Terms and Acronyms


Understanding the Ins and Outs of 3-D Stereoscopic Cinema

An interesting paper by Enrique Criado-Sors Cortés
Source: Enxebre Entertainment

The Many Ways to Create a 3-D Image

An interesting paper by Chris Chinnock
Source: Insight Media

James Cameron to Stick with 3-D

"James Cameron is looking to continue his pioneering stereoscopic 3-D efforts -- though not necessarily with a big action movie or visual effects-laden project like "Avatar."

"After 'Avatar,' I want to do something a lot smaller," Cameron said Thursday.

On the director's sonar is "The Dive," a true story about the romance between controversial Cuban free diver Francisco "Pipin" Ferreras and Frenchwoman Audrey Mestre. Under his guidance, Mestre became a free diver who broke several world records but died in 2002 while competing. (Free-diving competitors must hold their breath for long periods of time while deep under water.)

"It's a drama, a love story," Cameron said. "This will require underwater photography, which will look gorgeous in 3-D."

When dramatic 3-D is achieved, Cameron believes it could bring about "the kind of uncomfortable feeling that people like in a movie theater; they feel like they are being challenged. It can actually be quite powerful.

"I think (3-D for drama) is the big overlooked area (now) because the economics don't really drive that direction." Cameron added that action and computer animation will likely drive the 3-D market for a while, but with the infrastructure gains on the way, he expects that will change.

"The visual aesthetic of doing dramatic stuff in 3-D is very simple," he said. "Just don't remind people that they are watching a 3-D movie. That will take them out of the experience."

Ferreras was a central figure in a 2001 Imax documentary, "Ocean Men: Extreme Dive."

"Avatar" is scheduled to open Dec. 18, 2009."

By Carolyn Giardina, The Hollywood Reporter

3D and Film Factors

"From a technology standpoint, NAB 08 will be remembered for proving that the latest obsession with stereoscopic 3D is for real, and that the physics behind digital acquisition still need intensive R&D if they are to even equal the chemistry (dynamic range) behind film.

There were inspirational 3D demonstrations on the finishing side by Quantel, and from The Foundry and Digital Vision with upcoming software releases, but it was Quantel that put in the early leg work with its 3D-enabled Pablo, attracting 6,000 people to 43 road show events around the world, over 700 people to its ticketed stand event, and selling 19 stereo-capable systems in six months. The new tool was simultaneous left and right eye ingest, but Strategic Marketing Director Mark Horton was happy to talk first about the state of the market. “It is not a fad. The issue is not so much one of the technology; it’s one of knowledge. Once you have got that knowledge the technology is fairly easy to understand,” he said. “What there isn’t in the industry yet is a sufficiently large number of people who know how to shoot the material. The limiting factor will be how quickly the industry can gear up, particularly as there is a lack of offline systems currently. The Foundry has got some interesting tools, but they are doing the compositing side,” he added. “We are the entire workflow. If people give us material and they want a delivery master at the end we can do everything in the middle.”

Bill Collis, CEO of The Foundry, was showing off Ocular, a set of tools for people working on live action stereo. “What we’ve done is resolve the hard maths, so we’ve now got a dense disparity map linking the left and right eyes,” he said. “What you see with most manufacturers is that all they are doing is a horizontal X shift on one eye on planes of the image. This is fine, but it won’t do a proper interocular distance across all depths. We are going to see an awful lot of rubbish stereo that makes your eyes hurt, but some people will spend a great amount of money, such as Jim Cameron for Avatar, to get it right. And they will produce very pleasing films that really add something,” he added. “The worry is that all the rubbish will put consumers off the quality projects.”
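The "dense disparity map" Collis describes is a per-pixel record of the horizontal offset between matching points in the left and right images, as opposed to a single shift applied to one whole eye. The sketch below shows the general idea using OpenCV's stock block matcher; it only illustrates what such a map is, not The Foundry's Ocular algorithm, and the file names are placeholders.

```python
import cv2

# Compute a dense disparity map from a rectified left/right pair using
# OpenCV's basic block matcher. Illustrative only; placeholder file names.
left = cv2.imread("left_eye.png", cv2.IMREAD_GRAYSCALE)
right = cv2.imread("right_eye.png", cv2.IMREAD_GRAYSCALE)

stereo = cv2.StereoBM_create(numDisparities=64, blockSize=15)
disparity = stereo.compute(left, right)   # per-pixel horizontal offset (fixed point)

# Because the offset is known for every pixel, depth can be adjusted locally
# rather than by sliding one whole eye sideways by a single amount.
vis = cv2.normalize(disparity, None, 0, 255, cv2.NORM_MINMAX).astype("uint8")
cv2.imwrite("disparity_map.png", vis)
```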

Kodak’s Bob Mastronardi, worldwide TV segment manager, had a broad grin. He said: “All we’ve tried to do is improve our emulsions, raising the bar so that where we are advantaged in the highlights we’ve taken that two steps further. Where we had some disadvantages, in underexposure with grain, we’ve improved that as well. It was interesting that at both the Red Camera and Sony F35 presentations they pointed out that they are still not as good as film in terms of dynamic range,” he added. “After all that R&D money was spent, film is still the benchmark.” The camera manufacturers introduced many good new models working at 1080p and 2K, but they had a fit of honesty over 4K.

“There is no real 4K camera around,” admitted Jan Eveleens, Thomson’s GM of image capture. “Some people count their pixels in a clever way, but they are not real 4K. This is just pure physics, not marketing. A 4K camera that beats film is going to take a lot of blood, sweat and tears.”

Speaking for top Sony dealer Band Pro, CTO Michael Bravin said, “I think it will be two years before we have high performance 4K imaging. What I really want to see is a full-res 4K 35mm sensor camera by 2010.”

By George Jarrett, TVB Europe

Samsung Develops World’s First "Blue Phase" Technology to Achieve 240 Hz Driving Speed for High-Speed Video

Samsung Electronics Co., Ltd., the world’s largest provider of thin-film transistor liquid crystal display (TFT-LCD) panels, announced today that it has developed the world’s first “Blue Phase” LCD panel, which will offer more natural moving images with an unprecedented image-driving speed of 240 Hertz. Samsung is planning to unveil a 15” model of its Blue Phase LCD panel at the SID (Society for Information Display) 2008 International Symposium, Seminar and Exhibition, which will be held in Los Angeles from May 18 to 23.


Developed with an extremely cost-efficient design, Samsung’s Blue Phase mode does not require liquid crystal alignment layers, unlike today’s most widely used LCD modes such as Twisted Nematic, In-Plane Switching or Vertical Alignment. The new Blue Phase mode can form its own alignment layers, eliminating the need for any mechanical alignment and rubbing processes. This reduces the number of required fabrication steps, resulting in considerable savings on production costs. Additionally, Blue Phase panels will reduce the possibility of bruising at the LCD panel interface, whereby pressure on the screen can impair uniform brightness.

Overdrive circuits are currently applied to each LCD panel to improve the video image quality in premium LCD TVs, which are driven at 120Hz. The Blue Phase mode features a superior response rate, allowing images to be reproduced at 240Hz or higher without the need for any overdrive circuit; at 240Hz each frame is on screen for only about 4.2 milliseconds, so the liquid crystal must complete its switching well within that window. The term “Blue Phase” was coined when the technology’s developers observed bluish hues while watching their new liquid crystal mode in operation.

Although many academic and corporate institutions have researched this new liquid crystal mode, Samsung is the first to unveil a commercially viable product prototype using the “Blue Phase” technology.

Samsung expects to begin mass producing its Blue Phase LCD in 2011. The LCD panels will be mainly used in TVs that require high-speed video reproduction.

Source: Samsung

Can You Make a Live-Action 3D Movie on a Budget?

"It's been known for a while now (at least since the 2005 release of Chicken Little) that animated films do especially well in 3D versions. But the success of recent 3D projects like U23D, and especially Hannah Montana/Miley Cyrus: Best of Both Worlds Concert Tour — which opened to a stunning $31 million weekend on a relative handful of digital 3D screens in the U.S. — has many in Hollywood taking another look at the feasibility of 3D for live action, especially on a tight budget.

This summer's release of Journey to the Center of the Earth 3D will help answer some questions about the business model for live-action 3D, but it's doubtful that anyone has pushed the low-budget envelope quite as far as the crew behind Dark Country, a 3D thriller from Stage 6, a new distribution label within Sony Pictures for low-budget films. With a reported budget of just $7 million, director/actor Thomas Jane, cinematographer Geoff Boyle, and the team at digital-3D specialist Paradise FX had their work cut out for them.

That work included constructing what may be the world's smallest 3D camera rig—small enough to be mounted inside the new MK-V AR camera-stabilization system, a Steadicam-like rig that allows the operator to switch from high-mode operation (with the camera on top of the rig) to low-mode operation (with the camera picking up extreme low-angle shots from the rig's low end) in the same shot. That's where the SI-2K Mini—a 2K optical block about the size of a cigarette pack—came in.

The SI-2K Mini stereo rig

"When you think of 3D, you think of a camera half the size of a VW bug, and of Alfred Hitchcock literally digging trenches in the studio to film Dial M for Murder," director Thomas Jane tells Film & Video. "In the past, you were very limited by a 3D camera. But in this film, we were extremely mobile. Most filmgoers will probably take it for granted. They won't realize all the new ground that we've broken with this." Dark Country also used a stereo rig with Red cameras, but because it was very heavy and hard to fit into tight spaces, the smaller and lighter SI rig was used for Steadicam-style shots.

The SI-2K Mini stereo rig mounted atop
the MK-V AR camera-stabilization system


A front view of the rig

Jane calls Dark Country a "film noir psychological thriller," and says it's great subject matter for 3D. As an example, he cites the film's key location — the interior of a baby-blue 1961 Dodge Polara. "When we put the camera in the back seat of the car, you feel like you're in the back seat," he says. "You have the background, which is the landscape out in front of you; the midground, which is the bonnet of the car and the windshield; and the foreground is your subjects, the people and the seats. This enhances the stereoscopic effect. The idea of getting inside people's heads and creating a universe the audience can really feel like they're participating in was the challenge of making Dark Country."

So what is it about a low-budget thriller that demanded all this cutting-edge hardware? Jane doesn’t hesitate to describe one shot that made it all worthwhile. “We did a shot where we tracked along with a character, did a 360-degree move around the car as he got into the car, and then the car took off out of the parking lot and onto the highway, disappearing into the mountains,” he says. “We ended in a big, wide high shot — we had to build a ramp. Our MK-V operator followed me out of the restaurant, went all the way around the car, then followed the car, ran behind the car as the car exited the parking lot, and then, as we exited on the highway, walked up a 40-foot ramp that we constructed to get this big, high vista.

“Of course, we shot it all in stereo. It proved to be quite challenging, but it’s really effective when you see the final shot. It’s great, because you start on a rather close subject, and then as the car takes off we get farther and farther away and end up on this wide, beautiful vista.”

Workflow
No matter how great the subject is, 3D isn't easy. With the exception of cinematographer Geoff Boyle and A-camera operator Howard Smith—who wielded the MK-V AR rig—the entire camera department on location was affiliated with Paradise FX, a Van Nuys production company that's been doing 3D work for more than 15 years, and digital 3D for almost nine, according to Jim Hays, digital workflow supervisor. That meant the company had the knowhow to make connections between production and post-production workflows, which is key to making the process work.

To date, Hays says, 3D workflows handled by Paradise FX have largely involved Final Cut Pro and After Effects, partly because those tools are friendly to lower-budget productions. For Dark Country, however, Paradise decided to work with Iridas SpeedGrade—partly because Iridas was compatible with the CineForm RAW codec output by the SI-2K Minis, but also because it had a built-in feature to automatically flip one of the two eyes when stereo footage is dropped on the timeline. (The mirror used in the 3D beam-splitter rig reverses one of the two camera images during acquisition.) “None of the tools that we had before would allow us to do that,” Hays says. “That was the biggest help in terms of working with the footage on location in Albuquerque.”
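In a beam-splitter rig one camera shoots through the half-mirror while the other shoots the mirror's reflection, so that eye's footage arrives flopped left-to-right and has to be flipped back before the pair can be viewed or graded. A minimal sketch of that correction, assuming frames are held as NumPy arrays (my own illustration, not the SpeedGrade implementation):

```python
import numpy as np

def unflip_mirrored_eye(frame: np.ndarray) -> np.ndarray:
    """Undo the left-right mirroring introduced by the rig's half-mirror
    by reversing the frame's horizontal (x) axis."""
    return frame[:, ::-1]

# e.g. right_eye = unflip_mirrored_eye(right_eye_raw)
# Which eye needs the flip depends on how the rig is built.
```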

Small and light as it was, that’s not to say the SI-2K Mini stereo rig was a breeze to work with. For one thing, the first-generation camera-control and recording system was fairly primitive, with separate gigabit Ethernet cables running to two different laptop systems that were mounted on a board that had to be kept near the camera operator. Each computer recorded one of the two stereo views, including Iridas .Look files (metadata for non-destructive color grading), in the CineForm RAW format. (Silicon Imaging has since demonstrated a new version of the control-and-recording system, dubbed SiliconDVR, that will record the footage from both cameras to a single system via gigabit Ethernet.)

Although SpeedGrade could feed a Samsung DLP screen for 3D playback using active stereo glasses, it was still difficult to organize a timely schedule for viewing dailies on the time-strapped, low-budget production. “It was a very grueling struggle to make Dark Country on budget, and everyone had to do a lot more than would normally be expected of them,” explains Hays. “They didn’t get a chance to see the 3D dailies until about a week into production, mostly because of time commitments. But once they saw it on the Samsung DLP TV that we took there, the people who hadn’t been involved in 3D before got it. It’s an intangible benefit.”

'Free-Viewing' in 3D
For a dedicated 3D buff like Jane, however, there were ways of checking his work in stereo as he went along. “With the Red cameras, we had a live image in stereo—we had a monitor on the camera so we could put on polarized glasses and see the image, which was really great,” he recalls. “But with the SI cameras, we just had the two images on the laptops that were nailed to this goddamned board, and because of our budget we didn’t have a monitor that could flip [the images] back and forth in stereo.

“We ended up free-viewing,” he says, referring to an age-old tactic for merging a side-by-side stereo image pair by working your eye muscles to overlap two images and create a fused 3D image. “Some people can do it more easily than others. Fortunately, I had some experience, so I could check the stereo in an image if I crossed my eyes. It was really primitive, but it worked.”
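A cross-eyed check image of the kind Jane describes is simply the two eyes placed side by side with the views swapped, so that crossing your eyes fuses them into a single 3D picture. A minimal sketch, assuming both frames are same-sized NumPy arrays (the halves would be swapped back for parallel, wall-eyed viewing):

```python
import numpy as np

def cross_view_pair(left_eye: np.ndarray, right_eye: np.ndarray) -> np.ndarray:
    """Side-by-side image for crossed-eye free-viewing: the right-eye frame
    goes on the left half and the left-eye frame on the right half."""
    return np.hstack([right_eye, left_eye])
```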

“We were basically beta-testing these new systems,” Jane continues. “Different problems arose through production, all of which we were able to solve, but it slowed us down a little bit. Future productions will benefit from the ground that we broke. The second-generation SI-2K Mini cameras and the MK-V system will be tetherless. The capturing systems will be on board, and hopefully the motors will get smaller. What we want is ease of use. We want to be able to use the system just as quickly and conveniently as we would a 2D rig.”

So what is the prognosis for stereo filmmaking on a budget? Iridas CEO Lin Kayser is bullish, encouraged by what he sees in the marketplace. “I think it was last year at IBC, when I walked through the aisles and saw all the 3D playback technology being presented, that it struck me that this is a totally different situation than [the previous big surge in 3D production] 50 years ago,” he says. “There are so many technologies converging that allow you to see stereo on the screen that I think it’s here to stay. To prove the point, we’ve got this wonderful little small-budget movie, Dark Country. It shows that the tools have evolved to a point where even if you don’t have a big budget, you can work with them.”

Hays is a little more cautious. “I think a number of 3D features will be made this year in the $10 to $20 million range,” he says. “I’m not sure if it’s possible to repeat it on the budget Dark Country was made on. Bob Johnston, one of our executive producers, works really hard with people on a scenario to transition from a 2D budget to a 3D budget with the tools we have. What it takes to create not only a 2D movie—because, obviously, every 3D movie can be a 2D movie—but also the 3D movie as well, with all the potential additional revenue associated with that.

“When people see good live-action movies in 3D in the theater, it’s really going to be a turning point. But the material is not out there. Journey to the Center of the Earth 3D, the first 3D digital live-action feature, is coming out in July, and hopefully Dark Country will come out later this year. But 3D so far has been mostly for children. And when adults go and see that they can enjoy a movie in 3D even better than they can a standard 2D movie, that’s when you’re going to see an additional push in the market.”

By Bryant Frazer, StudioDaily