Hollywood Finds Depth: 3-D Boom

Hollywood is done with being shallow; it wants depth. 3-D depth, to be specific.

3-D filmmaking - the process of mounting two simultaneously filming cameras side-by-side and then superimposing the images on top of each other to create the illusion of depth - is staging a comeback. At the recent National Association of Broadcasters show in Las Vegas, everyone from the major film studios to music-video production companies to the National Basketball Association displayed eye-popping examples of films and clips capturing artificial depth. Silly 3-D glasses were actually kind of hip over in the high desert.

The technology is far from new. 3-D was first tried in the 1950s, and later popped up in horror films starring Vincent Price. Even auteur Andy Warhol tried his hand at the tools. But filming even brief segments with two cameras was cumbersome and expensive; the cost of shooting an entire 3-D film was prohibitive for most projects.

But now, the alchemic effects of digital technology and the Internet are changing the business of adding dimensions. Miley Cyrus and U2 have put out successful 3-D films; kids love 'em. Big-name directors including James Cameron have 3-D projects in the works - the hot rumor in Hollywood is that George Lucas is working to re-release the entire Star Wars saga in 3-D.

The good news for small businesses is that the field is ripe with growing companies developing innovative solutions for 3-D's many challenges. Take Iconix Video Inc. in Goleta, Calif. It has created a nifty, super-small, high-definition imaging system that's going great guns in 3-D.

Company founder Wayne Upton, a video engineer with a background in medical imaging, got the idea for the technology while on assignment with a large Japanese electronics maker in 2003. The company asked him to solve a problem with a high-definition camera. Upton found a solution in about 48 hours, and was so taken by his idea that he decided to start a company based on the discovery.

Who's cashing in on the 3-D boom?
Starting with a $1.2 million investment from venture-capital fund DFJ Frontier and others, Upton and his small company went into production with a high-def camera no bigger than an egg. The unit quickly racked up $1 million in sales based solely on word of mouth. It's easy to see why: The Iconix camera can be mounted just about anywhere - the side of a car, the end of a pool cue - to get once-impossible shots.

Upton's 28-person company has craftily partnered with other small businesses to expand its product line. It uses a clever two-camera mount from 3-D firm 3ality that lets directors precisely calibrate the crucial distance between the cameras. Iconix then uses off-the-shelf hard drives for storage and third-party post-production software to essentially create a complete 3-D system for little up-front investment.
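The interaxial distance that a two-camera mount calibrates translates directly into on-screen depth. A minimal sketch of the relationship, under a simplified parallel-rig pinhole model (all figures illustrative, not Iconix or 3ality specifications):

```python
# Simplified pinhole stereo model (illustrative numbers, not rig specs).
# For a parallel two-camera rig, an object at depth Z produces a horizontal
# on-sensor disparity of roughly d = f * B / Z, where f is the focal length
# and B is the interaxial (camera-to-camera) distance.

def disparity_mm(focal_mm: float, interaxial_mm: float, depth_mm: float) -> float:
    """On-sensor disparity for a parallel stereo rig."""
    return focal_mm * interaxial_mm / depth_mm

# Example: 35 mm lens, 65 mm interaxial (roughly human eye spacing),
# subject 5 m away.
print(f"disparity: {disparity_mm(35.0, 65.0, 5000.0):.3f} mm")   # 0.455 mm

# Halving the interaxial halves the disparity -- which is why precise
# calibration of B matters so much for comfortable-looking depth.
print(f"at B=32.5mm: {disparity_mm(35.0, 32.5, 5000.0):.3f} mm")
```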

"We believe it is the first end-to-end 3-D production environment that is within reach of just about any professional production company," said Bruce Long, Iconix's CEO.

Image quality in my limited demos was serviceable. A shot from a soaring glider, a flyover of a tropical island, and even a test section from the original Star Wars showed reasonable depth and none of the attendant disorientation - and often, motion sickness - common with 3-D.

But let's be clear here: The technology is no IMAX. The illusion of immersion is much more advanced in other systems. The technology favored by James Cameron, which was developed by cinematographer Vincent Pace, offers significantly more depth and higher image quality, to my eye.

But it's hard to argue with the price. Iconix cameras start at just $6,000 - way less than a tenth the cost of traditional HD cameras, which can run six figures or more.

With Hollywood facing stiff new competition from video games and other formats, hype about 3-D ran deep at the NAB show. Iconix's executives bullishly predicted that within the next few years, at least 30% of movies would be shot in 3-D.

But others, myself included, have more modest expectations.

"I think the hurdle that filmmakers face - and what will always keep this a niche technology - is that it is by definition cumbersome to shoot, even though great lengths have been made to simplify it," said Doug Liman, a Hollywood director and producer whose credits include Swingers, The Bourne Identity and Mr. and Mrs. Smith.

But despite the limitations, even Liman believes that this time, 3-D is here to stay.

"I do think it will be part of our moviegoing experience in the future," he said, "and we will have one to two 3-D films at the multiplex at any given time."

By Jonathan Blum, CNN Money

Nvidia to Make All Your PC Games 3D (If You So Choose)

Thanks to a new software driver Nvidia is cooking up, any PC game can be played in 3D, with no extra work on the part of game developers.

Beginning this summer, any PC with an Nvidia graphics processor will be able to run a game in normal mode or, with the aid of 3D glasses, in 3D. The software driver renders two views--left eye and right eye--which, at the push of a button, appear blurry and pixelated to the naked eye. Viewed through the 3D lenses, though, the game pops into three-dimensional mode.
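The "two views" idea boils down to rendering the same scene from two virtual cameras offset horizontally by the interocular distance. A minimal sketch of that offset at the view-matrix level (the values and the function are hypothetical; a real driver does this inside the graphics pipeline):

```python
import numpy as np

# Sketch: derive left- and right-eye view matrices from a single base camera
# by translating it +/- half the interocular distance along the camera's
# x-axis. All numbers are illustrative.

def eye_view_matrix(base_view: np.ndarray, interocular: float, eye: str) -> np.ndarray:
    """Return base_view shifted half the interocular distance for one eye."""
    offset = np.eye(4)
    half = interocular / 2.0
    offset[0, 3] = +half if eye == "left" else -half
    return offset @ base_view

base = np.eye(4)                         # identity camera for the demo
left = eye_view_matrix(base, 0.065, "left")
right = eye_view_matrix(base, 0.065, "right")
print(left[0, 3], right[0, 3])           # 0.0325 -0.0325
```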

Nvidia's Drew Henry models the working prototype of 3D glasses.

The important part is that game developers won't have to do anything differently. They just continue to make their games the way they want, and Nvidia will take care of the rest. It's just an option for gamers though--it doesn't mean all games have to be three-dimensional.

The challenge for Nvidia is making the glasses widely available at retail, as well as turning the silly-looking lenses into something actually cool and "not as geeky-looking," said Drew Henry, general manager of the company's Media Communications Processor (MCP) group.

And yes, they promise the final product will be much cooler than the glasses pictured above.

By Erica Ogg, CNET News

DreamWorks' Katzenberg Disappointed with 3D Talks

DreamWorks Animation SKG Chief Executive Jeffrey Katzenberg on Tuesday said he was disappointed with the pace at which movie theater chains were moving to deploy digital and 3-D technology.

"Things haven't progressed as well as I had hoped," Katzenberg told analysts on a quarterly conference call.

"I feel as though things have dragged along, and it's been pretty disappointing," said Katzenberg, a huge proponent of 3-D films, who has pledged to make all future films in 3-D at an incremental cost of $15 million per film. For DreamWorks, which makes about two films a year, that commitment to 3-D amounts to about $30 million per year.

Katzenberg said on the conference call that he still believed DreamWorks would see a good return on its investment, based on projected ticket prices and the number of 3-D screens he is certain will be in the market by the time his studio's first 3-D film, "Monsters vs. Aliens," is released in spring 2009.

"But whether or not it achieves the fullest potential and outside goals I've set for ourselves and challenged exhibition with, is the thing up for grabs right now," he said.

"We have indicated that we would like to see 5,000 3-D screens domestically by the time we released 'Monsters vs. Aliens,' but we need to make sure that major theater chains are committed to getting these screens in the next 30 days or it's unlikely we will get all 5,000 screens," said Lew Coleman, Chief Financial Officer for DreamWorks, in an interview.

Katzenberg had hoped that by now the Digital Cinema Implementation Partners, a venture owned by Regal Entertainment Group, Cinemark Holdings Inc and AMC Entertainment Inc, would have reached a $1.1 billion financing deal with Hollywood studios to deploy digital cinema technology. Once outfitted with digital projectors, theaters can then add 3-D technology.

The DCIP first hoped to clinch the deal by the fourth quarter of 2007, but various issues prolonged the talks. Travis Reid, chief executive of DCIP, last month said he hoped to conclude a deal in the second quarter 2008. Reid declined comment on Tuesday.

About 4,000 of the 37,000 cinema screens in the United States are digitally equipped, while a little more than 1,000 screens have 3-D capability. Some analysts have cited concerns there will not be enough 3-D screens to accommodate all the upcoming 3-D titles from DreamWorks and other studios due out in coming years. Many in Hollywood look to the success of the 3-D concert movie, "Hannah Montana/Miley Cyrus: Best of Both Worlds Concert Tour," which grossed nearly $30 million in its opening weekend, as a template for the future. Tickets sold for $15.

Once the DCIP digital upgrade starts, it is expected to take about three years to complete the upgrade of the 14,000 screens of those theater chains involved.

"In terms of getting the big three on board and actively moving forward, I feel as though things have dragged along, and it's been pretty disappointing," Katzenberg said, referring to the movie chains.

"If these guys don't get their act together very quickly in the next 30 days, they're not going to be able to achieve that goal and it will start to deteriorate quickly. Every week that goes by will be several hundred less screens that will manage to be rolled out in the timeframe," he said.

By Sue Zeidler, Reuters

P2Soft Speeds and Simplifies P2 Files

Opencube Technologies released Version 2.0 of P2Soft HD at the NAB Show. P2Soft HD is designed to simplify the ingest and management of SD and HD MXF P2 files.

It can automatically launch and begin ingest as soon as a P2 card is inserted into the drive. A preview function allows for content viewing before ingest is complete.

The software detects production systems and alerts media asset management systems to update the database. P2Soft HD also optimizes metadata management, both for metadata recorded to the P2 card and for metadata added during viewing.

Source: TV Technology

Barco introduces High Definition Reference Monitor for Broadcast and Post Production Environments

Barco, a leader in display and visualization solutions for selected professional markets, announces the RHDM-2301, its new High Definition Reference Monitor for broadcast professionals. With the RHDM-2301, Barco presents the first true grade-1 LCD display, one that assures broadcast staff in production and post-production environments of the highest level of color stability and accuracy for their video reference tasks.

Studio professionals need the highest level of image quality and stability to do their job. Until now, no LCD-based monitor has been able to match a CRT-based reference monitor. Today Barco presents the first true grade-1 LCD display that can compete with a CRT monitor. Barco's RHDM-2301, built on patent-pending technology, delivers the highest level of color accuracy, image stability and motion handling. It features some of the most advanced LCD technologies, such as a 10-bit panel and real-time, color-stabilized LED backlights.

Grade 1 is a standard formally defined by the EBU that sets requirements for contrast, black levels, and brightness. In the process of real-time color judging, Barco’s RHDM-2301 provides continuous color stability, perfect representation of grayscales, deep darks and pastel tones. In addition, Barco's RHDM-2301 display is a perfect tool for monitoring of fast moving video. Thanks to the high-speed 120 Hz panel and scrolling LED backlight technology, it avoids motion blur and judder. This results in excellent motion handling and lifelike motion scenes which are reminiscent of how CRT reference monitors used to show moving video.

With the RHDM-2301, Barco presents a reliable LCD based reference for camera scene alignment and lighting staff. Two video channels can be positioned in split screen, allowing a very detailed colorimetric judgment of multiple input sources. The RHDM-2301 does not need frequent maintenance, as it features advanced calibration sensor technology, integrated into the display’s electronics. In OB vans and transmission monitoring environments, RHDM-2301 users significantly benefit from the display’s robustness and system stability. The reference monitor can easily withstand temperature cycling conditions, so that frequent calibration is not necessary. It also features selectable video speed handling to judge fast moving scenes on the same monitor.

The display offers a de-interlace and inter-frame handling option that allows users to validate multiple application formats. Unlike any CRT design, the RHDM-2301 uses an accurate wide-gamut display approach with selectable color spaces. This “master reference” design allows multiple users to exchange content between sources, editing suites, digital intermediate generation and delivery situations based on one single common reference. Because this reference is accurate and stabilized over time and place, users enjoy the benefits of color consistency combined with a CRT-like video representation.

The RHDM-2301 will be available for sale toward the end of 2008.

Source: Barco

NAB 2008 - Take-aways

I spent NAB 2008 walking around, looking for gear for our production company and getting a feel for where things are headed in general. I took away several strong impressions about where the industry is going—as well as a couple of interesting toys.

The Format Wars ain’t over
With the advent of DV-based standard definition formats, we thought that the format wars would finally wind down. But in 2008’s HD ferment, the opposite is true: there are more formats—more codecs, more wrappers, more storage media—than ever before:

- Codecs: JPEG2000 (Cineform, REDCODE, GVG Infinity), MPEG-2 (HDV, XDCAM, various servers; ATSC broadcasting), MPEG-4 (AVC-Intra, AVCHD, HDCAM-SR), DV (DVCPROHD), HDCAM, ProRes422, DNxHD, just to name a few. And even within a codec family, so many variations: in 1080-line HDV alone, you’ve got the original 1080i, Canon 24F, Sony 24P...

- Wrappers: MXF (both OP atom and OP1a, wrapping MPEG-2 and DVCPROHD), MP4 (wrapping MPEG-2 essence on the Sony EX1 and EX3 camcorders), MPEG-2 program streams and transport streams (HDV), DPX, OpenEXR, QuickTime, AVI...

- Storage: tape, optical disk, hard disk, or solid-state; sometimes with three of these options available on a single camera (JVC HD200 with HDV tape, plus the new add-on SDHC/HDD recorder).

Those are just the quick examples I pulled out of thin air; it’s not meant to be a comprehensive list by any means. And there’s no indication that this multiplicity of choices is likely to thin out any time soon.

Solid State Storage is where the future is
Every major camera vendor now offers solid-state storage options:

- Panasonic: P2 (DVCPRO and AVC-Intra) and SDHC (AVCHD). All Panasonic’s new camcorders use solid-state recording. The only new product with any tape capability is the HVX200A, which keeps standard-def DV tape for those who need to keep using that workflow.

- Sony: XDCAM EX on SxS cards; HDV on CF cards. Sony even showed an SxS studio deck.

- JVC: an add-on module for HDV recording on SDHC cards (also with a 10-hour hard disk).

- Ikegami: GFCAM, a solid-state flashpack version of EditCam.

- Canon: SDHC cards or internal memory only on their tiny AVCHD camcorders.

- RED: CF cards.

One of the hottest items at the show was Convergent Design’s Flash XDR recorder, recording MPEG-2 to CF cards. Toshiba and SeaChange showed solid-state video servers. Codex Digital‘s portable cine recorder can be loaded with either hard disk or flash memory recording packs.

We’re not to the point where the solid-state card is cheap enough to be the archival medium—shoot on it, hand it to the client at the end of the day, shelve it for future retrieval—but with IBM’s recent developments in spintronics that day may not be far off.

Resolution, shmesolution!
There are HD and digital cine cameras available with anywhere from 960x540 photosites (several of the Panasonics) to 5760x2160 (Sony F35). In between you’ve got the standard 1280x720 and 1920x1080, as well as oddballs like 1440x810 (Sony HVR-Z7, HVR-S270) or 2880x2160 (Arri D21). RED alone has announced three different sensor sizes: 3K, 4K, or 5K.

Add to that the two standard HD resolutions, 2K or 4K digital cinema release formats, SD deliverables in both 480 and 576-line formats (which could be either 4x3 or 16x9), recording resolutions (for 1080-line HD, you can choose from 1280, 1440, or 1920-pixel-per-line sampling depending on your recording format), not to mention the different screen resolutions someone might view your content on, and it’s pretty clear that the good old days of “shoot 525/59.94, edit 525/59.94, deliver 525/59.94, view 525/59.94” are gone forever.
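The recording-resolution wrinkle is easiest to see with a little pixel-aspect-ratio arithmetic. A short sketch, using the standard relationship between stored width, height, and display aspect (the function name is mine):

```python
# Many "1080-line" formats don't sample a full 1920 pixels per line; HDV,
# for example, records 1440x1080 with non-square pixels that still display
# as 16:9. The pixel aspect ratio (PAR) is what makes the numbers add up:
#   display_aspect = (stored_width * PAR) / height

def pixel_aspect(stored_w: int, h: int, display_aspect: float = 16 / 9) -> float:
    """PAR needed for a stored raster to fill the given display aspect."""
    return display_aspect * h / stored_w

for w in (1920, 1440, 1280):
    print(f"{w}x1080 -> PAR {pixel_aspect(w, 1080):.3f}")
# 1920x1080 -> PAR 1.000
# 1440x1080 -> PAR 1.333
# 1280x1080 -> PAR 1.500
```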

GigE shared storage
At least two vendors were selling HD-capable shared storage for editing with only Gigabit Ethernet connections: EditShare and Laird Telemedia’s LairdShareHD. With high-quality editing codecs like ProRes422 HQ, Canopus HQ, DNxHD, Dirac, and Cineform, GigE has the bandwidth when the network layer is properly tuned. Both Laird and EditShare use carefully optimized kernels to ensure that the maximum possible throughput is maintained, allowing responsive editing of multiple HD streams without the complexity of Fibre Channel SANs.

If that’s not studly enough for you, CalDigit has storage systems that extend the PCIe bus all the way out to the external drive array through PCIe switching fabrics. Freaky, geeky fun! Fast, too. They even have ExpressCard34 adapters so your laptops can access the same storage.

RED is here to stay
By showing mockups of the $3,000, 3K “Scarlet” and the $40,000, 5K “Epic” cameras, RED Digital Cinema Camera Company has indicated that they’re here to stay: the RED ONE isn’t a one-off. RED also showed a 4K optical disk player and improved lenses, and quietly released a beta-version Log & Transfer plugin for Final Cut Pro, allowing one-step ingest of RED clips directly to an FCP bin. You can even apply looks created in RED Alert! during import; it’s not quite the SpeedGrade workflow just yet, but it’s getting closer.

By Adam Wilt, ProVideo Coalition

NAB2008: The Return of Stereoscopic 3-D

Avid and Apple may have been conspicuously absent from NAB earlier this month, and attendance may have dipped from 111,028 to 105,259, but the 2008 edition of the Las Vegas confab was a lot more interesting to cover in many ways. First, there were a few exciting new tools introduced that generated a lot of buzz, including eyeon Software's Generation, a new collaboration app for conforming, editing, compositing, annotation, versioning and render management, and the HP DreamColor Technology computer display, which was previewed during a keynote address featuring DreamWorks Animation CEO Jeffrey Katzenberg (via video feed), DreamWorks Animation Chairman Roger Enrico and HP's EVP Todd Bradley.

Touting an open architecture, Generation is designed to ease vfx workflow collaboration between supervisors and production leads and allows senior artists to share in the process as well. Fully compatible with eyeon's products, Fusion, Rotation and Vision, Generation provides the simultaneous play of multiple versions while experimenting with multiple cuts to compare various projects. Shot refinements, from storyboard through to animatics to finished shots, are all tracked and versioned visually with realtime playback and commenting with workstation/laptops. Generation will be available this summer.

The HP DreamColor display "provides accurate, predictable color and a simple color management process to assure vision-to-production color consistency in a widescreen liquid crystal display (LCD)." The display generates the industry's first combination of true 30-bit color -- enabling a range of 1 billion colors -- in an LED-backlit LCD at a fraction of the cost of most high-end, studio-quality LCD displays.

"For decades, storytellers have struggled to manage color in an accurate and consistent manner," said Katzenberg. "Quite simply, when we make a movie about a big, green ogre, our concern is that our ogre is the same color of green throughout the film. HP has truly changed the game with its new display, giving DreamWorks Animation full visual fidelity across the board for the first time."

More details about DreamColor to come...

Plus, there was the usual excitement surrounding new digital cameras, including Sony's professional F35 and smaller PMW-EX3 as well as the new RED 5K Epic, 4K Red Ray and 3K Scarlet.

Speaking of which, Boxx Technologies offered redBoxx, a new solution that allows film directors, colorists and vfx experts to view 4K footage shot with the RED ONE digital cinema camera at full quality 2K resolution in realtime with full debayering, without having to use expensive and time consuming scanning and transcoding. redBoxx is based on a specially engineered version of the 3DBoxx workstation and includes Assimilate's new SCRATCH CINE software package for Digital Intermediate work. redBoxx is designed to work only with REDCODE .R3D files. The redBoxx arguably completes the puzzle of modern digital workflows by allowing filmmakers to create highly polished results faster, easier and at lower cost than current standard DI technology can offer.

In addition, Albuquerque-based Cinnafilm unveiled the HD 1 realtime film look system, a potential breakthrough in applying a rich film look to nearly any digital video source while maintaining visual integrity. Still in development, the HD 1 is an advanced vfx solution utilizing a patent-pending GPU parallel processing engine that streamlines transfer and rendering procedures in a quick and cost-efficient manner. The HD 1 is the brainchild of Cinnafilm founder Lance Maurer, who has enlisted the help of Los Alamos National Labs in its engineering.

Most of the interest and excitement at NAB, however, concerned the return of stereoscopic 3-D and its potential industry boost, despite a note of caution issued by NATO President John Fithian about a looming "train wreck" if studios and theater owners don't come to a quick settlement over digital cinema fees, which could derail agreements to install 22,000 digital 3-D screens for next year.

Nevertheless, NAB introduced a new Content Theater in the central hall devoted to panel discussions about 3-D and other topics, including globalizing Bollywood, how CG technologies meet classic techniques for character animation (presented in association with the VES), case studies about Horton Hears a Who! and Pushing Daisies and broadband media workflow.

But a whole day was devoted to lively discussions about all things stereoscopic, including one about the art of 3-D. Panelists for this one included Peter Anderson, 3-D DP and vfx supervisor (U2 3D); Eric Brevig, vfx supervisor and director of Journey to the Center of the Earth 3D; Phil McNally, global stereoscopic supervisor, DreamWorks Animation; Vince Pace, CEO, Pace and 2nd unit director of photography on Avatar; Rick Rothschild, SVP, Walt Disney Imagineering; and Sean Phillips, DP, vfx supervisor and director of Sea Monsters.

Digital cinema and improved glasses have made the stereoscopic experience more comfortable and immersive. Learning how to compose shots while avoiding eyestrain remains key; understanding how to shoot and edit is also vital; and work toward an efficient workflow is still underway.

"It is the Mona Lisa effect -- it is more personal," offered Anderson.

"[3-D creates] an emotional bond," asserted Pace. "It will make it transcend effects films. It shows off the acting craft and athleticism that complements the storytelling," he added in a veiled reference to Avatar.

"The film [Journey to the Center of the Earth 3D] was primarily dramatic," Brevig said. "I could put the camera anywhere. In 3-D, it gives you something extra. I could sit on a close-up longer. And longer shots were more engrossing."

In fact, 3-D may one day alter storytelling. Rothschild suggested that up-and-coming filmmakers re-learn theatrical blocking to understand how to take advantage of depth of field and slower editing principles.

In a subsequent case study on Journey to the Center of the Earth 3D, which opens July 11 from New Line and Walden, Brevig added that he needed to watch 3-D dailies and ID eyestrain. Editor and 3-D consultant Ed Marsh said it was quite an experience in storytelling and that he is looking forward to Avid's reported development of 3-D editing tools, which includes, among other things, cutting in 2-D with info for both eyes and viewing 3-D on a display.

Brevig stressed that his strategy included the following: keeping focus imaged in three-dimensional space, avoiding eyestrain, and keeping volume in a comfortable range by reducing the distance between lenses.

Meanwhile, there was a host of new tools to help facilitate better stereoscopic 3-D workflows:

Quantel showed off its new stereoscopic 3-D post systems, which provide "a true realtime end-to-end 3-D post process." The new stereo 3-D toolset is available as an option on all new Pablo 4K, iQ4 and Max 4K systems. Additionally all existing Pablo 4K or iQ systems can upgrade to stereoscopic 3-D, providing a "start to finish" stereo workflow, including previsualisation, editing, vfx, color correction, trailers and mastering. Stereo projects can now take place "in context" without the need for guesswork. The Quantel Stereoscopic 3-D Option for Pablo iQ and Max has the ability to playout and manipulate two simultaneous streams of HD or 2K in sync and without rendering. To address the needs of more cost sensitive post houses Quantel has also launched a new dedicated stereoscopic post-production workstation called Sid. Sid comes in two configurations: as a full stereo online system and also as a straightforward viewing, conform and mastering system.

The Foundry, meanwhile, previewed its own efforts in stereoscopic 3-D workflow as part of the new Nuke 5. There are tools for left- and right-eye image viewing, as well as the ability to view the composite anaglyphically as a 3-D preview and to render anaglyphically into OpenEXR files.
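An anaglyph preview of the kind described above is conceptually simple: take the red channel from the left-eye image and the green and blue channels from the right-eye image, so red-cyan glasses route one view to each eye. A minimal sketch with NumPy (the function and demo data are mine, not Nuke's API):

```python
import numpy as np

# Minimal red-cyan anaglyph composite: red channel from the left-eye image,
# green and blue channels from the right-eye image. Arrays are HxWx3 RGB.

def anaglyph(left: np.ndarray, right: np.ndarray) -> np.ndarray:
    """Combine left/right eye frames into a single red-cyan anaglyph frame."""
    out = right.copy()
    out[..., 0] = left[..., 0]    # take red from the left-eye image
    return out

# Tiny synthetic demo: a pure-red "left" frame and a pure-cyan "right" frame
# combine into a white composite.
left = np.zeros((2, 2, 3)); left[..., 0] = 1.0
right = np.zeros((2, 2, 3)); right[..., 1:] = 1.0
print(anaglyph(left, right)[0, 0])    # [1. 1. 1.]
```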

Iconix has expanded, both organically and via acquisitions, from a camera company to a fully integrated service provider. In addition to tools for broadcast and digital cinema, Iconix now offers an end-to-end stereoscopic 3-D pipeline, including cameras, rigs, on set storage and post solutions. The newest generation of Iconix cameras, the Studio 2K, is "the only POV camera system capable of 2K digital cinema outputs" and is ideally suited for stereo 3-D applications. According to Iconix CEO Bruce Long, a former vfx producer, "3-D is an outgrowth of vfx, animation and greenscreen shooting. It's an ideal time to move to 3-D rigs. There's a paucity of solutions. Quantel showed up. We deploy a stereoscopic pipeline using off-the-shelf hardware and proprietary software. Ours is a lower entry for filmmakers and the first end-to-end stereo solution from camera, rig and storage device."

The Polecam head (left) is a 3-D stereo rig. The Studio2K camera (right) is at the acquisition end of the Iconix 3-D stereo pipeline.

Autodesk, while demo'ing its full range of solutions, announced at its User's Group meeting that it will unveil its highly anticipated stereoscopic workflow strategy at SIGGRAPH in LA. Sebastian Sylwan, Sr. Industry Manager for the Film, Media & Ent. division, who spearheads the stereoscopic strategy, said Autodesk intends to take a holistic approach to the 3-D pipeline. "Stereo blurs the line further and there are a lot of elements that need to fit properly, thanks to single projectors and camera rigs. Stereo is not a grading problem but is part of the pipeline. The grammar of stereo is the big question. How will this change storytelling?" he asked excitedly.

By Bill Desowitz, VFXWorld

3D Image Sensors - What Are They?

When I first saw the announcement of 3D image sensor chips cross the newswire, I must admit my first reaction was a bit dismissive. Mistakenly, I assumed this was some kind of gimmick that paired two conventional image sensors a few inches apart to create stereoscopic images, something that has been readily doable with two separate, independent image sensors for years. "A more convenient package," I thought to myself. I couldn't have been more wrong.

What Canesta, the company behind these sensors, and Tower Semiconductor, the foundry, have announced are true depth-perceiving image sensor chips that work on a totally different principle than ordinary image sensors and ordinary stereoscopic 3D. The 3D image is not created by stereo vision -- essentially two separate 2D images, the way our eyes work -- but by actually sensing the depth to various objects in the scene using a single image sensor. It's kind of like light-wave-based radar on a pixel-by-pixel basis.

By Cliff Roth, Video Imaging DesignLine

CanestaVision Chips
Most people understand that light takes a finite time to travel between two points -- that photons of light from two different stars, for example, may have started their journeys years, or even millennia apart. Since light travels essentially at a constant speed, if you know the time, you can calculate the distance.

The light illuminating each individual pixel in an image sensor comes from a different feature in the scene being viewed. Canesta recognized that if you could determine the amount of time that light takes to reach each pixel, you then could calculate with certainty the exact distance to that feature. In other words, you could develop a three-dimensional "relief" map of the surfaces in the scene. In three dimensions, objects previously indistinguishable from the background, for example, metaphorically "pop" out. For a broad class of applications, this proves extremely helpful in reducing the mathematical and physical complexity that has plagued computer vision applications from the start.

In a recently granted U.S. patent, Canesta describes several of its inventions for "timing" the travel time of light in a unique, new class of low-cost sensor chips.

Fundamentally, the chips work in a manner similar to radar, where the distance to remote objects is calculated by measuring the time it takes an electronic burst of radio waves to make the round trip from a transmitting antenna to a reflective object (like a metal airplane) and back. In the case of these chips, however, a burst of unobtrusive light is transmitted instead.

The chips, which are not fooled by ambient light, then either time how long the pulse takes to reflect back to each pixel, using high-speed on-chip timers in one method, or simply count the number of returning photons, an indirect measure of the distance, in another.

In either case, the result is an array of "distances" that provides a mathematically accurate, dynamic "relief" map of the surfaces being imaged. The image and distance information is then handed off to an on-chip processor running Canesta's proprietary imaging software that further refines the 3-D representation before sending it off chip to the OEM application.
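The underlying arithmetic is the same as radar's: distance is the speed of light times the round-trip time, divided by two. A short sketch showing the scale of the timing involved (the function is mine, for illustration):

```python
# The time-of-flight principle behind a depth-sensing pixel, in numbers:
# distance = c * round_trip_time / 2. At room scales the round trip is a
# few nanoseconds, which is why it takes dedicated high-speed on-chip timers.

C = 299_792_458.0                 # speed of light, m/s

def distance_m(round_trip_s: float) -> float:
    """Distance to a surface given the pulse's round-trip time."""
    return C * round_trip_s / 2.0

# A pulse returning after ~6.67 nanoseconds came from about one metre away.
t = 2.0 / C                       # round-trip time for exactly 1 m
print(f"{distance_m(t):.3f} m, round trip {t * 1e9:.2f} ns")
```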

Source: Canesta

Codex Digital Announces Data Support for ARRIFLEX D-21

Codex Digital announced that it is adding data support for ARRI’s new ARRIFLEX D-21, as well as future ARRI cameras, to its HD, 2K and 4K systems.

The Codex Recorder and Codex Portable field recorder currently support the ARRIFLEX D-20 in data mode, unleashing the potential of this camera to operate beyond HD. As an approved supporter of ARRI’s ARRIRAW T-Link, Codex Digital’s uncompressed and JPEG2000 data recording will let users record the D-21’s highest possible resolution and dynamic range and deliver captured material directly into a post-production chain. The material is recorded complete with all available metadata and delivered in any format required for effects and finishing.

Codex recording systems can capture and store the raw, unprocessed data from ARRI’s 4:3 sensor, making maximum use of its 2880 x 2160 resolution and 12-bit bit depth, with real-time playback, including unsqueezed anamorphic material.

Source: BroadcastEngineering

Snell & Wilcox Creates File-Focused Spinoff AmberFin

Snell & Wilcox was at the 2008 NAB Show last week not only to hawk its video-processing and switcher products, but also to make a pitch for AmberFin, a new company that will market iCR, Snell’s software-based system for content repurposing, and other new software products aimed at file-based workflows.

AmberFin (the name has no special meaning) will be run by Snell CEO Simon Derry and will remain part of the Snell & Wilcox Group, which is based in Basingstoke, United Kingdom, and owned by European venture capital firm Advent Venture Partners.

The intellectual property behind iCR will also be available to both companies. But AmberFin will have a dedicated engineering team, including Snell principal research-and-development engineer and MXF (material-exchange format) expert Bruce Devlin, and soon a dedicated sales force, as well.

Given the success of iCR -- which is used by large customers such as Warner Bros., Sony Pictures, Ascent Media, Technicolor and British Telecommunications to repurpose content for distribution on multiple platforms -- it may seem strange to separate a growth segment from the main Snell & Wilcox business unless plans for an outright sale or initial public offering are underway.

But according to Derry, no such plans are in place for AmberFin. Instead, he said, creating a separate company was the most efficient way to market and develop file-based software systems, which have a different development cycle and business model than the real-time video hardware Snell has been selling for decades.

“The software and services business model is different from the rest of the Snell & Wilcox product line,” Derry added. “We’re working with companies like HP [Hewlett-Packard] and IBM. There are different price strategies, and we need to appeal to a broader audience than our existing customers.”

By Glen Dickson, Broadcasting & Cable

The Broader Issue: Digital 3D Part 3 & 4

"For the third and fourth programmes in this series of discussion programmes on stereoscopic 3D, Bill Scanlon turns his attention to visual science and how our brains perceive objects in the physical world and in the worlds created by stereoscopic filmmakers.

Bill visited City University London’s Applied Visual Sciences department to talk to Michael Morgan, a world-renowned Professor of Visual Psychophysics who leads the department’s research into visual perception. For his book, ‘The Space Between Our Ears: How Our Brain Represents Visual Space’, Michael won a Wellcome Trust prize.

They were joined by Neil Harris, whom many of you will recognise as the pioneering designer of the Lightworks non-linear digital editing system, for which he won a Scientific and Technical Oscar. He was also the architect of the Sohonet private media exchange network and one of the founders of the Computer Film Company, which is now part of Framestore|CFC. Neil is currently developing software to semi-automate the process of converting 2D content into stereoscopic 3D."

Download Part 3
Download Part 4

Source: The Broader Issue

Sony Acquires Gracenote

"Sony Corporation of America announced on Tuesday that it is acquiring Gracenote, the provider of music metadata and recognition technologies, for US$260 million in cash plus other unspecified consideration. Sony intends to keep Gracenote as a separate business unit. The deal is expected to close by the end of May.

This move follows fairly quickly after Macrovision's acquisition last November of Gracenote's competitor All Media Guide (AMG). Those two companies -- and a third, Muze -- compete to offer the most complete and most accessible databases of music metadata; they seek to integrate their data into media player devices, PC "jukebox" software, and online music services. Gracenote also offers music fingerprinting technology that it acquired from Philips in 2005, customers for which include the social networking site imeem.

The acquisition is a smart one for Sony on many levels. First, it is a hedge against a future -- which we view as likely -- in which consumers will no longer pay directly for music recordings. Rich, pervasive, high-quality metadata is going to be vitally important in the ability to build music services that consumers will actually pay for. In acquiring Gracenote, Sony is doing more than acknowledging the importance of metadata: it is positioning itself to succeed in garnering revenue from music services, where it has very visibly failed in the past.

Even better, the deal will put Sony in a position of control over key information that its competitors will need to build their own products and services. In other words, it will be able to charge companies like Samsung, Sandisk, and possibly even Apple (iTunes uses Gracenote's database) a "toll" for accessing metadata and music recognition services.

Unlike the Macrovision-AMG deal, this one involves one of the leaders in media fingerprinting technology: Gracenote and Audible Magic are the two major players in the music market. This ought to be both good and bad news for other fingerprinting technology vendors, especially the several that are appearing in the video market -- startups like Vobile, Zeitera, iPharro, and Advestigo as well as bigger players like Philips and Thomson. The good news is that the value of this technology is being recognized; the not-as-good news is that the market will demand synergies with other technologies, data, or services in order to realize that value. Fingerprinting vendors will need to find such synergies as the opportunities for new types of online media services become clearer."

By Bill Rosenblatt, DRM Watch

French Chain Starts to Go Digital

"The first fully digital multiplex has opened in France, where the groundbreaking D-cinema deal between Gallic exhib chain Circuit George Raymond and Arts Alliance Media, Europe’s leading digital cinema distribution outfit, is moving forward at a great pace. CGR’s 12-screen flagship multiplex in La Rochelle went 100% digital this month and London-based AAM promises 100 screens on the CGR circuit will be converted to digital by July.

This is the first stage in the rollout of AAM’s digital makeover of all CGR’s 400 screens. Under the deal, AAM procures, services and maintains all digital systems for CGR, France’s third-biggest chain, which specializes in multiplexes in midsize towns.

“La Rochelle is a benchmark for the digital cinema transition in France and in Europe. Thanks to CGR, the French cinema industry has now a great opportunity to witness and understand the operations of a fully digital cinema,” commented Gwendal Auffret, AAM’s managing director of digital cinema.

"This will open new perspectives to our group in terms of 3-D and alternative content, as well as faster and more flexible programming and increased efficiency,” enthused Jocelyn Bouyssy, CEO, CGR Cinemas.

When the two companies pacted last November, it was the first time AAM had done a virtual print fee deal with an exhibitor. AAM has signed digital cinema deployment agreements with five studios: 20th Century Fox, Universal Pictures Intl., Paramount Pictures Intl., Sony Pictures Releasing Intl. and Walt Disney Studios Intl.

AAM is in active negotiations for further deployment agreements with other distribs and exhibs and announcements are expected shortly. Firm is also increasingly focusing on sourcing, managing and promoting alternative content for cinemas."

By Archie Thomas, Variety

Samsung Claims First 3D-ready Plasma HDTV

"Samsung Electronics, a worldwide leader in innovation and HDTV technology, has introduced a new way to watch movies and games in HD with the world's first 3-D Ready Plasma HDTVs. The new Series 4 450 Plasma provides consumers with an unmatched, interactive 3-D experience, enabling users to view a wide range of content, including movies, video games and Web content. Beyond 3-D functionality, the Series 4 450 provides a superior HD 720p picture for traditional home entertainment viewing.

According to John Revie, senior vice president of Visual Display Marketing at Samsung Electronics, "The Samsung Series 4 450 Plasma responds to the demands of consumers looking for new and innovative ways to enjoy multimedia entertainment at home. The 3-D capability of these new displays will allow our users to enjoy the broad range of exciting 3-D content already available, as well as the large volume of 3-D entertainment slated to become available in the near future."

The Series 4 450 3-D Ready Plasma HDTV
When combined with an appropriate PC and 3-D Accessory Kit (both sold separately), the Series 4 450 Plasma can showcase movies, games and more like never before in clear, lifelike 3-D, right at home. Currently, 3-D technology has already begun to draw serious interest from the film industry. A number of animated and live-action 3-D titles are already available to consumers and major Hollywood directors are currently working on a growing number of 3-D films. Emphasis on the new format has led to an expected rise in the number of 3-D-capable cinemas across the country, and thus an expected increase in 3-D titles available for home viewing. The interest extends to the PC gaming community as well, with 3-D games and content expected from a variety of renowned software makers.

The Series 4 450 Plasma also features crisp HD 720p resolution, a dynamic contrast ratio of up to 1,000,000:1 for deep blacks and bright whites, and rich, 18-bit color processing for stunning picture quality time and again. Samsung's FilterBright anti-glare screen technology works to maintain a clear picture regardless of a room's external lighting. Three different enhancement modes (Sports, Cinema and Game) improve audio and video processing of different content, making the Series 4 450 a perfect choice for the passionate movie buff, the most enthusiastic sports fan, and the hardcore gamer.

Hidden, down-firing speakers combine with SRS TruSurround XT processing to give the Series 4 450 an added touch of stylish design while creating deeper, richer sound levels.

With user convenience and home theater versatility in mind, the 450 Series comes equipped with three (1 side, 2 rear) HDMI-CEC connections (Anynet+) to control the latest home theater products using a single remote. A PC input is available to enjoy the latest content on a larger, theater-like display.

Going beyond picture quality and a variety of features, the Series 4 450 Plasma is designed with a piano-black and silver bottom décor, and a slim bezel finished with soft, rounded edges. The resulting sleek design adds class and grace to most any space, giving consumers an HDTV that is aesthetically appealing even when it is turned off.

The Series 4 450 3-D Ready Plasma is available now in 42-inch (PN42A450 @ $1,199) and 50-inch (PN50A450 @ $1,699) sizes."

By Dennis Barker, EETimes

Enter a 3D World with Samsung

"I just attended the Samsung A/V Roadshow 2008 press conference where Samsung Philippines introduced an extensive lineup of 34 multi-platform and high-definition products, all designed to satisfy every consumer lifestyle. Samsung featured its latest High-Definition (HD) televisions, home entertainment systems and digital imaging products.

The new products are really impressive, but what really caught my attention was the new Samsung Series 4 plasma HDTV. It's the world's first 3D-ready, flat-panel HDTV on the market. Engineered to be both adaptable and affordable, the Series 4 is the perfect choice for consumers looking to enter the plasma space with a cost-effective, flat-panel HDTV that can do it all.

"Samsung is playing a very important role in the digital convergence age because we have the best quality components and devote considerable resources to R&D, creativity and strong management to navigate and produce some of the world's best high-end consumer digital products," said Spencer Shim, president and CEO of Samsung Electronics Philippines. "By integrating 3D technology into the plasma category, we're offering consumers the benefits of both a flat-panel set and 3D movie viewing and gaming. We're looking forward to continuing to expand our new technologies across all of our HDTV categories."

The Series 4 comes packaged in a lustrous, piano-black exterior, and features first-class components and advanced features typical of premium models. Built-in entertainment modes automatically optimize image and audio quality for the content being viewed, a must-have for the avid gamer, sports fan or movie buff. Further enhancing the Series' viewing experience is the FilterBright anti-glare technology, which produces deeper blacks and an increased contrast ratio for overall superior picture.

A variety of multimedia devices can be easily connected and used through three HDMI inputs, one of which is conveniently located on the side of the set. Extending that connectivity even further, Samsung has also included a USB 2.0 port, making it simple to view content directly from digital devices such as cameras, MP3 players, and thumbdrives. In addition, the enhanced HDMI-CEC TV remote enables consumers to easily control multiple devices connected via the HDMI port.

The Series 4 utilizes technologies and innovations previously available only in 3D-enabled DLP HDTVs to deliver the superb picture quality of a flat-panel TV with enhanced 3D features. The Series 4 uses an advanced software algorithm to eliminate dither noise and false contour lines and reproduce clear images, including fast-moving action scenes. The Series 4's multimedia compatibility via three HDMI inputs and its sophisticated picture and sound optimization tools make it ideal for gamers and movie buffs alike.

I tried it by wearing its 3D glasses and the experience was really different. Watching TV and movies will never be the same again. Unfortunately, I got to enjoy it only while it lasted. Trying it and owning one are totally different things."

By Jerry Liao, CNET Asia

Samsung PN42A450

Alex Villafania checks out the Samsung PN42A450, the world's first 3D-enabled plasma TV.

Source: Inquirer

Zecotek Granted Australian Patent for 3D2D Display System

Zecotek Photonics Inc. is pleased to announce that it has been granted acceptance by the Australian Commissioner of Patents for its Real-Time 3D2D Display System technology. Australian Patent Number 2006200812 provides patent protection for three-dimensional stereoscopic display systems.

"This is a very important step in the development of our 3D display technology as it is recognition that our system is innovative and that our intellectual property can be protected," said Dr. A.F. Zerrouk, Chairman, President, and CEO of Zecotek. "We have approached the challenge of 3D visualization from a completely different direction. Existing 3D displays have significant deficiencies or require viewers to wear special glasses. Our Real-Time 3D2D Display System is the only 3D system that provides multiple viewers with a true three dimensional picture, offering both depth and volume in high resolution over a wide viewing angle."

In November 2007, Zecotek initiated demonstrations of the 32" commercial prototype of its proprietary, and now patented, 3D display system for a select group of representatives of potential industry partners. The Company is also working with Insight Media, a leading publishing and consulting firm focused on the display industry, to finalize a market entry strategy for the 3D display system.

Zecotek's Real-Time 3D2D Display System is a novel, patented display system for images and data and represents a new generation of 3D displays. The 32" commercial prototype is described as multi-user/multi-view with continuous 3D within the viewing angle. With substantial patented innovation, the 3D display delivers full colour, high resolution images in both 3D and 2D mode (greater than 1024x768), has high definition quality images in 2D mode and is compatible with existing rear projection television components. Anteryon BV of the Netherlands manufactures the key screen component of the 3D display.

With its innovative technology, continuous 3D and high resolution, Zecotek's 3D display technology has attracted great interest from a variety of end users for applications in education, computer generated games, advertising, medical imaging, medical training and air traffic control.

Source: FoxBusiness

Moving in Stereo: Display Week Goes 3D with Special Session on 3D in Cinema

The Society for Information Display (SID), the leading global organization dedicated to the advancement of electronic-display technology, today announced a unique 3D technology-focused addition to its program lineup for Display Week 2008, May 18-23, 2008, at the Los Angeles (Calif.) Convention Center. The Special Session on 3D in Cinema, slated for Wednesday, May 21, will feature invited talks from leading experts in the field, on topics spanning the full 3D movie process, from content creation (animation and live action) and editing to post production and theatrical display.

The session topic is timely, given viewer attendance at 3D versions of recent films such as Beowulf and Hannah Montana/Miley Cyrus: Best of Both Worlds. Both generated record-breaking per-screen averages from 3D locations, delivering eight times the box office revenues of theaters showing the films in standard format. Moreover, conversion to digital cinema technology continues to escalate rapidly, according to U.K.-based Dodona Research. The cinema-focused consulting and research firm estimates that by 2013, half of all cinema screens worldwide will have converted to digital technology from traditional 35mm projectors, whereas about 5 percent of the global total had made the conversion as of late 2007. And, Dodona emphasizes, 3D will serve as the driver for this explosive growth.

To help Display Week attendees gain this in-depth understanding, SID created the 3D in Cinema session, inviting speakers who are working at the cutting edge of contemporary 3D moviemaking to explain how the characteristics of 3D display technologies shape every aspect of the movie creation process. Topics and speakers for the session will include:

- It's Not Real Life: Stereoscopic Content Creation - Phil McNally, DreamWorks Animation

- Adapting "3D" CG Films for "3D" Presentation: The Technique and Technology - Rob Engle, Sony Pictures Imageworks

- Stereoscopic Live Action: Content Capture and Post Production - Steve Schklair, 3ality Digital Systems

- Post Production for Stereoscopic Movies - Norman Rouse, Quantel

- Making 3D An Integral Part of Today's Cinema Experience: A Pragmatic Approach - Jeff McNall, Dolby Laboratories

- 3D Exhibition in the Digital Age: Bringing a New Dimension to Entertainment - Rod Archer, RealD

Throughout the presentations, session attendees will be treated to clips of such 3D features as Beowulf, Chicken Little, Meet the Robinsons, Monster House, The Polar Express and U2 3D. These and other 3D pictures have each played a part in helping digital cinema continue to move to the next level. The 3D in Cinema special session will kick off with a luncheon keynote address by Andy Hendrickson, VP of technology for Walt Disney Animation Studios, who will delve into the evolution of display technology and the various display-related challenges and opportunities specific to the entertainment industry.

The 3D in Cinema special session will be held in the L.A. Convention Center on Wednesday, May 21, from 2:15 p.m. to 5:15 p.m., with a question-and-answer period to follow. The cost to attend is included in the fee for the Symposium technical program, which features additional sessions on 3D display technologies and applications ("3D Applications and Measurement Techniques" from 3:40 p.m. to 5:00 p.m. on Tuesday, May 20; "Novel 3D Displays" and "Stereoscopic Displays" from 9:00 a.m. to 12:00 p.m. on the morning of the Special Session; and "3D Integral Imaging and Autostereoscopic Displays" from 10:40 a.m. to 12:00 p.m. on Thursday, May 22).

For those not attending the full Symposium, the session registration cost is $100. More information, including synopses of each talk, is available at www.sid.org/conf/sid2008/program/3d.html.

Source: The Society for Information Display


"The second product that struck me came from a new company, Cinnafilm, and could be revolutionary (and I hate to even use that word). Cinnafilm was started by former aerospace engineer Lance Maurer and has as its technical advisor Brad Carvey, who won a technical Emmy as part of the team that developed the Video Toaster. The Cinnafilm HD1 is a black box that provides a real-time film look to video. Since more and more people are shooting on video and not going out to film, providing a film look in real time at a fraction of the cost should be extremely appealing. Users will be able to select from a menu of common film stocks or start from scratch and create an entirely unique “stock” from a plethora of controls.

The system will be leased for $10,000 a month, and more on pricing will be released soon, with the cost coming down over time. Seeing is believing: starting in May, Cinnafilm will accept 30-second video clips, apply a look to them, and send them back to you as proof of what they can add to your film. Cinnafilm has its own studio in Albuquerque, New Mexico for film finishing.

But that alone is not the revolutionary part. Soon after the June release of the HD1, Cinnafilm will be adding the InfiniFrame and InfiniPlane modules to the system. InfiniFrame lets you create an infinite number of frames between regular 24-fps frames; the new slow-mo footage plays back absolutely smoothly, with no artifacting. They’d told me about this over the phone before NAB, but seeing is believing. And infinite means infinite: during the demo they played back a 6-second clip that took 20 minutes to play back. InfiniPlane will allow you to create depth of field: going into a shot where much of the image is in focus, selecting the point to remain in focus, and then throwing everything else out of focus in a natural, gradual manner. Again, not a bad tool to have for HD production, and it is all real time.
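For intuition only: the crudest way to synthesize in-between frames is a linear cross-fade between neighboring frames. Cinnafilm has not disclosed its method, and serious slow-motion tools rely on motion estimation rather than blending, so treat this sketch (function name and structure mine) purely as a baseline illustration.

```python
def interpolate_frames(frame_a, frame_b, n_between):
    """Generate n_between intermediate frames between two frames by naive
    linear blending. Frames are flat lists of pixel intensities."""
    frames = []
    for i in range(1, n_between + 1):
        t = i / (n_between + 1)  # blend weight moves from frame_a to frame_b
        frames.append([(1 - t) * a + t * b for a, b in zip(frame_a, frame_b)])
    return frames
```

Blending like this produces ghosting on anything that moves, which is exactly the artifacting that motion-compensated interpolation is designed to avoid.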

Combining these modules, Cinnafilm will be developing a third module which will allow for simple 2D-to-3D creation. They are working on this with Los Alamos National Laboratory, which worked on a little thing called the Manhattan Project, and also to accurately assess their color replication and processor response."

By Matt Armstrong, StudioDaily

The VFX Show: Stereoscopic 3D

Mike Seymour, Mark Christiansen and Ron Brinkman discuss stereoscopic 3D.
Download the podcast

Source: The VFX Show

QubeMaster Xpress

QubeMaster Xpress can convert your slides, animations and videos into ready-to-use Digital Cinema Packages (DCPs) that will work on any digital cinema server. QubeMaster Xpress effortlessly guides you through the process of converting JPEG, TIFF, TGA and BMP images, Flash animations as well as AVI and Quicktime movies into standards compliant DCPs in DCI JPEG2000 and MPEG-2 MXF Interop formats.

System requirements
A desktop or laptop computer with a dual-core processor, 1 GB RAM, a DVD-ROM drive and running the Windows XP Professional or Vista Business or Vista Ultimate operating system.
Coming soon for Apple OS X.

Video Interview with Ted Schilowitz, RED Digital Cinema

"Leader of the Rebellion" Ted Schilowitz introduces the newly announced Scarlet 3K and Epic 5K cameras, plus a new 4K player called RED Ray.
Download the video

Source: Digital Cinema Society

Iconix 3-D Rig

"The Iconix camera is one of the very few HD cameras small enough to position side-by-side in a 3D rig with the normal 2.5” interaxial spacing.

Even the SI-2K Mini 3D rig on Silicon Imaging’s booth used a beamsplitter to get the proper interaxial spacing. The Iconix camera itself (which in use is tethered to a CCU with a multipin cable) would just about fit inside your average prescription pill bottle."

By Adam Wilt, ProVideo Coalition
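To see why that 2.5-inch interaxial matters, stereography textbooks approximate the horizontal disparity recorded by a parallel camera pair as d = f · b · (1/C − 1/Z), where f is the focal length, b the interaxial, C the convergence distance and Z the subject distance. This sketch of that standard formula is illustrative only; the function name and unit choices are mine.

```python
def screen_disparity_mm(interaxial_mm, focal_mm, conv_m, subject_m):
    """Approximate horizontal disparity on the sensor (mm) for a parallel
    stereo rig converged (by image shift) at conv_m, with the subject at
    subject_m. Positive values place the subject behind the screen plane."""
    return focal_mm * (interaxial_mm / 1000.0) * (1.0 / conv_m - 1.0 / subject_m)
```

Disparity scales linearly with the interaxial, which is why rigs either need tiny side-by-side cameras like the Iconix or a beamsplitter to keep b small.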

Watch Björk's 'Wanderlust' in 3-D

"Icelandic songstress Björk never ceases to amaze audiences with her wildly inventive costumes and imaginative visual style. The electro-pop performer's new 3-D video for the song "Wanderlust" is a bewitching homage to an other-worldly era, following Björk as she goes on a mystical voyage and herds prehistoric beasts through a lush wilderness.

Shot entirely in 3-D using custom-built equipment by San Francisco Bay Area filmmakers Encyclopedia Pictura, the seven-minute video took nine months to produce. More than 150 artists, sculptors and interns worked on the project.

Wired.com brings you the exclusive web premiere of the 3-D version of the video along with exclusive behind-the-scenes coverage."

By Jenna Wortham, Wired

Apple Preparing an iPod Visual Head-Display System

On April 17, 2008, the US Patent & Trademark Office published Apple’s patent application titled Head Mounted Display System. Apple’s patent generally relates to head-mounted display systems. More particularly, the present invention relates to improved arrangements for processing and displaying images in a head-mounted display system. Apple’s laser based binocular near eye display system could apply to glasses, goggles, a helmet or other gear not specified. The bottom line is that iPod Shades are on their way!

Head Mounted Display System
Apple’s patent FIG. 8 gets to the heart of the matter. It’s a diagram of a head mounted display apparatus with integrated fiber optic line [point 202] and electrical lines [point 204], in accordance with one embodiment of the present invention.

The fiber optic line is integrated at a first side of the frame and the electrical lines are integrated with a second side of the frame 152. The optical lines 202 are configured to transmit light from a remote light arrangement to the head mounted display apparatus. The electrical lines may for example include data lines for video control signals and audio data and/or power lines for powering the imaging device and other devices of the head mounted display apparatus. In one embodiment, the optical line and electrical line are formed as a lanyard. In this embodiment, the head mounted display apparatus may further include an adjustable clip 206 for adjusting a loop size. As such, the head mounted display apparatus can be better secured to the user’s head, and further be removed while still being retained around the user’s neck.

Binocular Near-Eye Display System
Apple’s patent FIG. 2 below is a simplified diagram of a binocular near eye display system 10, in accordance with one embodiment of the present invention.

The binocular near eye display system 10 is configured with a display unit 11 that displays video information directly in front of a viewer's eyes. The display unit 11 is typically mounted on wearable head gear 16 that places the display unit 11 in its correct position relative to a viewer’s eyes. The wearable head gear 16 may for example be embodied as a pair of glasses, goggles, helmets and/or the like. This arrangement is sometimes referred to as head mounted display glasses (or HMD glasses). FIG. 3 is a block diagram of a laser based binocular near eye display system, in accordance with one embodiment of the present invention.

Apple’s patent FIG. 4 shown below, is a block diagram of a laser based binocular near eye display system 100, in accordance with one embodiment of the present invention. The laser based binocular near eye display system 100 may for example correspond to the laser based binocular near eye display system shown and described in FIG. 3.

The laser based binocular near eye display system 100 includes a remote video controller 102, a remote laser light engine 104 and a head mounted display apparatus 106 having a MEMs based imaging device 108 that is in communication with but is physically separated from the remote video controller 102 and laser light engine. During operation, the MEMs imaging device 108 cooperates with the laser light engine under direction from the video controller to create left and right video images associated with a video signal. The left and right video images are transmitted to a display unit 112 capable of making the left and right video images viewable to a user.

The remote video controller is configured to receive a display signal from a display source and divide the image frames of the display signal into left and right images. In one embodiment, this is accomplished by simply duplicating the image frames into two image frames. In other embodiments, additional operations can be performed when dividing the image frames, such as translating the images by some fraction in the horizontal plane so that the left and right images are slightly different. This may be done to create stereoscopic images. The remote video controller is also configured to convert the right and left images into colored pixels and generate synchronized RGB and image control signals for creating each of the colored pixels. The RGB control signals are sent to the remote laser engine and the image control signals are sent in parallel to the MEMs image device 108 located at the head mounted display apparatus. This is generally accomplished using separate data lines 120.
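The frame-division step the patent describes, duplicating each frame and optionally translating the copies horizontally to create disparity, can be sketched as follows. This is illustrative only; the names and edge handling are mine, not Apple's.

```python
def make_stereo_pair(frame, shift=1):
    """Duplicate one frame into left/right copies, translating each copy
    horizontally in opposite directions by `shift` pixels (shift >= 1).
    Edge pixels are repeated to keep the frame width constant."""
    left, right = [], []
    for row in frame:  # frame is a list of rows of pixel values
        left.append(row[shift:] + [row[-1]] * shift)
        right.append([row[0]] * shift + row[:-shift])
    return left, right
```

A real implementation would fill the revealed edges more carefully, but the opposite horizontal shifts are what produce the stereoscopic offset the patent mentions.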

Apple’s patent FIGS. 5 and 6 below are diagrams of a head mounted display apparatus 150, in accordance with one embodiment of the present invention. The head mounted display apparatus 150 may for example correspond to any of those described herein.

Source: MacNN

NAB 3D-D3 (Day Three)

"My last day at NAB was quite short, but very interesting. I said yesterday that I was planning to go listen to Alvin Toffler. I read his book “Powershift” in my early 20s and it taught me so much about the evolution and shape of the current science fiction we are now living in. Remember when 2000 sounded like blue neon, electric cars and portable visio-phones? There were even guys selling us the very idea of “Intelligent Computers”. Only the ones who believed them are now dumber than their blueberry.

On the way to the hotel check-out, I ran into Chris Ward from Lightspeed Design. After I told him I'd enjoy my right to remain silent until his lawyer gave me a phone call, he delighted me with a few details about what they are working on over there. Want a hint? Look at what they do best and expect to see it even better! Catchy, isn't it? I should work in PR; I could sell 3D cell phones.

By the way, I eventually decided I would bypass Alvin to enjoy a little more of Chris. Am I burning my idols, or is a casual chat with 3D maniacs now more valuable than the keynote of the world's futurologist? I wish I had a copy of Toffler's speech.

This was a good move, for I met my French accomplice in the 3D hunt: Stephane. He tipped me off on a booth I would have missed: Evertz. They built a real-time 4:4:4 dual HD-SDI 3D card with two inputs and three outputs (7732DVP-3D-HD). This $7K marvel can take in a pair of 4:4:4 HD signals, flip the beam-split one, and output both in full resolution, along with a 3D-encoded feed in side-by-side, over-under, interlaced or checkerboard format. And since there are four HD-SDI connections, you could input four cameras in 4:2:2 and mix them live for Pierre Alio's auto-stereoscopic HDTV.

Evertz 3D booth

The Dolby booth being right next door, it was too tempting not to try my luck; after all, it's Las Vegas. Bingo: there was an Infitec color-coding device free-wheeling on a shelf. The coating shows up nicely in a simple digital picture.

Dolby RGB filter color wheel

From that point on I went to the NHK booth. On the way I stopped by to greet Florian Maier at the P+S Technik booth. He is the CAD guy who helped industrialize Alain Derobe's beam-splitter rig. Alain is our top-notch stereographer in France, the giant whose shoulders we're riding on. While I was chatting with him, Ethan, from TDVision, came along. Soon we were joined by Marty Brenneis, the dad of Kerner's stereoscopic camera. He frowned at the $15K rig, mentioning the lack of any motorization. I should have asked him about the price of his own opto-electro-mechanical beast. The last figures I was told were around $2M of development cost. These guys lucky enough to fly a space shuttle daily should remember last century, when they were pedaling on the way to the sandbox. By the way, Marty, I'd love to see your camera and have a sneak view of the 3D Holy Grail in Santa Lucas.

Florian showing the P+S Technik 3D Rig to Ethan from TDVision

The interview with Atsushi Murakami was great. He is the head stereographer at NHK Media Technology. He designed all the equipment used for Scar. We talked about the history of electronic 3D, 3D cinematography, the future of 3D, all sorts of 3D chat.

Atsushi Murakami, Lead Stereographer at NHK Media Technologies

SpectronIQ 3D screens on NHK booth

Then I ran to the airport and flew back to L.A.

NOTE: This short blog coverage of NAB 3D led to a couple of emails from product managers worried that I did not seem to enjoy their products. Let's make a couple of things clear. I'm a f*****g perfectionist and will always put my finger where it hurts. I'm sorry about that, and I apologize for it on a regular basis to my subordinates and relatives. Complaints should be addressed to my parents, not to me. If your product really sucks, I just don't mention it, because I don't like having to choose between being hypocritical or impolite, and everyone should have a chance to improve before receiving bad publicity. Since this blog is free, and delivered for free, I intend to enjoy and exercise my freedom. On the other hand, I swear I'll do my best not to give you bad press in a serious (i.e. paid-for) publication without double-checking with you. Unless you really screwed the whole earth like Microsoft did with Vista. That one was easy."

By Bernard Mendiburu, Digital Stereographer

Hollywood Goes Shopping for 3D Displays

"With 3D cinema production gaining momentum, the need for new 3D tools and equipment is getting more obvious every day. Until now, 3D movies were produced using makeshift solutions. You’d be surprised to know how many pairs of anaglyphic glasses are used right now at Sony Imageworks, DreamWorks Animation, Disney, 3Ality and Pacific FX.

When I was working on Meet The Robinsons 3D at Disney, everything was done in anaglyphic 3D until we screened our dailies in a full-sized Digital 3D Cinema with a Real D Z-Screen. That was also the case at every other post house working on a 3D project. There may be a couple of salvaged CRTs with liquid crystal shutter (LCS) active glasses, plus a number of dual-projector passive stereo rigs, but that’s basically it. These displays are shared equipment, but they require special servers and file formats, so the feedback loop from making a 3D adjustment in computer graphics (CG) software to the actual screening can take up to a full day. Do this a few times and your week is gone.

You now understand why the studios’ stereographers were so enthusiastic when Samsung and Mitsubishi released their 3D-RPTV based on TI’s SmoothPicture technology. At last, it was possible to watch 3D with good frame rate, resolution, colorimetry and brightness, without walking to a digital theater. It was even possible to hook up these marvelous TVs to a computer and watch that 3D in real time - almost. The lack of any "checkerboard" display driver led the studios to build dedicated Linux boxes with HD-SDI inputs and DVI outputs to generate, on the fly, the 3D stream needed by their new 3D TVs.

Now that 3D is taking over the whole animation studio, the need for a new generation of 3D displays is even more obvious. Stereographers like me are still waiting for the solution. Based on discussions with some of Hollywood’s top stereographers, here’s the wish list for a perfect post-production 3D display:

1 - The display should be small enough to fit on a regular-sized desk. Forget about dual LCDs with a splitting mirror, or giant 60" TVs.

2 - If glasses are used, they should be lightweight: no one wants to wear heavy LCS glasses from 9am to 5pm.

3 - The display should be 2D compatible, so the GUI around the 3D picture can be read. Anyone who has ever tried to read a text box where each eye sees half of the letters’ pixels knows how painful it is, often forcing a reversion to anaglyphic 3D. The ability to switch just an area of the screen to 3D is a key function.

4 - The display should be multi-viewer compatible to be shared with colleagues or with the department supervisor when commenting on 3D settings. Forget about single-user head-tracking auto-stereoscopic screens.

5 - The 3D format used by the display should be straightforward enough to be implemented in an OpenGL wrapper.

Ultimately, there are not that many candidate technologies, and they usually fail on classic issues pertaining to price, availability or image quality.

The other 3D display solutions needed by 3D movie makers relate to on-set visual control and screen size simulation. On set, it is important to have visual control of the 3D image. No director of photography (DP) will shoot color without a color monitor. So, why should a stereographer work with no 3D feedback? Some kind of HMD or stereoscopic viewfinder would fit the bill. If you’re currently designing one, please include a couple of buttons. One button should set the virtual screen size at, say, 30 to 60 feet wide. Another button would set the virtual screen distance at, say, 20 to 80 feet.
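The depth math behind those two hypothetical buttons is standard stereoscopy: perceived distance follows from on-screen parallax, eye separation and viewing distance, and parallax scales linearly with screen width. A small Python sketch of those textbook relations (the numbers below are made up for illustration):

```python
# Illustrative stereoscopy math -- standard textbook relations, not any
# vendor's firmware -- showing how a hypothetical viewfinder's "virtual
# screen width" and "virtual screen distance" buttons change perceived depth.

EYE_SEP_FT = 0.21  # roughly 2.5 inches interocular, in feet

def perceived_distance(parallax_ft, screen_distance_ft, eye_sep_ft=EYE_SEP_FT):
    """Distance at which a point with the given on-screen parallax appears.

    Positive parallax places the point behind the screen; parallax equal
    to the eye separation pushes it out to infinity.
    """
    if parallax_ft >= eye_sep_ft:
        return float("inf")
    return screen_distance_ft * eye_sep_ft / (eye_sep_ft - parallax_ft)

def scale_parallax(parallax_ft, old_width_ft, new_width_ft):
    """Parallax grows linearly when the same frame fills a wider screen."""
    return parallax_ft * new_width_ft / old_width_ft

# The same shot simulated on a 30 ft vs. a 60 ft screen, viewed from 20 ft:
near = perceived_distance(0.05, 20)
far = perceived_distance(scale_parallax(0.05, 30, 60), 20)
```

This is exactly why a screen-size button matters: a parallax that sits comfortably behind a 30-foot screen can blow past the eye separation, and become unfusable, on a 60-foot one.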

At this year’s edition of the NAB Show, expect to see 3D camera rigs, 3D edit suites, 3D visual effect solutions, 3D digital intermediates (DI) and a fair amount of 3D footage. I bet you’ll see all of this on polarized projection or 3D RPTVs. Like other visitors, I’ll be looking for that simple, small, cute and efficient 3D display that will, at last, replace my antiquated CRT and its bulky active glasses.

Look for lots of coverage from NAB related to both 3D and 2D in future Insight Media newsletters and Display Daily columns."

By Bernard Mendiburu, DisplayDaily

3D on the NAB Radar

"I have just returned from a brief trip to NAB, which is a big show with a lot to see. Fortunately, Matthew Brennesholtz, Aldo Cugnini, Bernard Mendiburu and Pete Putman were also there to help out with coverage. For this column, I think I’ll focus on 3D. The message — while 3D is not a mainstream topic of conversation at NAB, it is clearly on the minds of a lot of companies there, more than I would have thought. Here’s a high-level summary of some of the things I saw. For complete coverage, I am afraid I need to ask you to subscribe to our newsletters (it’s tough to do this for free, you know).

Matt had a chance to attend the Digital Cinema Summit, hosted by SMPTE, which ran over the weekend before NAB opened. There were lots of discussions about 3D movie making, the 3D cinema industry and 3D technology. What caught Matt’s attention in particular was Real D’s description of their "light doubling" technology for 3D digital cinema. It was what Matt expected — a polarization-recycling scheme — but good to have that confirmed now.

There was also a Content Theater set up, with Monday mostly devoted to 3D. I didn’t catch all of this, but did witness a live demo of 3D transmitted in a standard video stream, which was displayed on a projector and two LCD TVs. The demo was conducted by 3ality and was quite a hit. This was followed by a replay of three NBA games that were shot with the help of Vince Pace. The footage was quite good, but the tales of acquiring it were also revealing.

Later, I met with NHK, which has now started to broadcast three or four 15-minute segments of 3D content per day in Japan. To support this, they have a crew that goes and shoots 3D every day and is rapidly learning the ropes in this new medium.

I also met a supplier of 3D recording gear who told me they are working with a US supplier to outfit 3D trucks to do about 30 live events this year. Wow!

We also saw several 3D camera rigs and new 3D cameras at NAB, with perhaps a half dozen other suppliers I didn’t have time to meet personally. However, coverage of these suppliers and technologies will be included in the next issue of Large Display Report.

TDVision gave a very nice briefing on their 3D encoding scheme that will allow 3D movies to be written to a Blu-ray disc in a proprietary format. The main advantage here is that the disc will play in any Blu-ray player in 2D or 3D. In 2D, the extra 3D information is ignored, and if you have a 3D-capable TV with a TDVision decoder, you can see the movie in 3D. Unfortunately, if you have a 3D-capable TV with an incompatible decoder from another manufacturer, you are out of luck. Hopefully, the 3D industry has the good sense to avoid a format war.
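TDVision's actual bitstream is proprietary, but the general "2D-compatible" idea can be sketched in a few lines: store one eye as an ordinary 2D stream, and tuck the other eye away as auxiliary data that a legacy player simply ignores. A toy Python illustration (the XOR delta here is a stand-in for real compression):

```python
# Toy sketch of 2D-compatible stereo encoding -- the general concept only,
# not TDVision's format. Frames are modeled as raw bytes; the auxiliary
# channel stores a per-byte XOR delta between the two eyes.

def encode_2d_compatible(left_frames, right_frames):
    """Pack a stereo pair into a base (2D) stream plus auxiliary 3D data."""
    return {
        "base": list(left_frames),  # any 2D player reads only this
        "aux_3d": [bytes(a ^ b for a, b in zip(l, r))
                   for l, r in zip(left_frames, right_frames)],
    }

def decode(stream, has_3d_decoder):
    """A 2D player returns the base view; a 3D decoder rebuilds both eyes."""
    if not has_3d_decoder:
        return stream["base"]
    rights = [bytes(a ^ b for a, b in zip(l, d))
              for l, d in zip(stream["base"], stream["aux_3d"])]
    return list(zip(stream["base"], rights))
```

The backward compatibility falls out of the layout: the base stream is untouched, so a player with no 3D decoder never even knows the auxiliary data is there. The format-war risk is in the auxiliary layer, where each vendor can define its own delta.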

We also saw the Hyundai and SpectronIQ 3D LCD TVs, which can play 2D or 3D content. Both feature an x-pol assembly laminated to the panel that rotates the polarization for alternate rows, allowing simple passive glasses to separate the two images.

The Hyundai TV is on sale now in Japan for about $5,000 and the SpectronIQ set is expected this fall at around $6,000. I plan to take a look at the retailing of these Hyundai sets while I am in Japan next week.

And, we announced the official launch of the 3D@Home Consortium to a room full of new members, interested parties and members of the media. We revealed a cast of 22 international companies led by Samsung, Philips and Disney. Look for more companies to join this effort in the near future.

I am particularly happy to see the enthusiastic embrace of this industry-led consortium. They see the value in working together to remove the roadblocks and speed adoption of 3D into the home. And getting this content to the home, the job of the NAB crowd, will be critical. That’s why I was also pleased to see attention and resources from major NAB companies turning in this direction. Maybe 3D TV will indeed be the next big thing."

By Chris Chinnock, DisplayDaily

Stereoscopic is the New 3D

"In 2007 Axis Films co-sponsored a set of HD camera tests, in which a battery of digital cameras, from the Viper FilmStream to prosumer HD camcorders, were meticulously run through their paces and tested against each other and workhorse film cameras. In early 2008 Axis hosted another illuminating symposium, this time on the emerging 3D production and exhibition technology.

Axis was joined by The 3D Firm, Can Communicate, Inition and Quantel. Over 200 guests attended the two-day event, which offered a demo and ongoing workshop on the latest in 3D capture, post production and exhibition technologies, with particular emphasis on broadcast application.

It is difficult to open any publication about the media industry without reading or hearing someone extolling, or damning, the economic and aesthetic attributes of 3D exhibition. The showcase at Shepperton gave industry technicians and producers an opportunity to look beyond the hype and get some real facts.

First of all, if you want to sound like you know your stuff — avoid saying “3D”. Say instead, “stereoscopic”. The terms are interchangeable, but stereoscopic video is the accurate description of the medium.

As with stereophonic audio, the effect relies on only two sources of information — left eye and right eye. The stereoscopic effect encourages the brain into believing it is observing objects existing in a 3D space, in the same way that a two-speaker audio system encourages the brain to believe it is hearing sounds from a multitude of sources — when in fact there are only two. The 3D name is better suited to marketing and advertising; stereoscopic describes the real production process.

Stereoscopic post production has been revolutionised in the digital age. Quantel’s Pablo was on show at the Axis demo, dazzling attendees with its deftness in handling 3D editing and post. The Quantel system, used extensively in many phases of post production, requires very little reconfiguring to manipulate stereoscopic data. Given the saturation of visual effects and compositing content throughout the industry, most post production workstations are ready to handle stereoscopic moving images.

In fact, most of 3D production and post production is fairly unremarkable. To say it is the same as conventional production, but with one more camera, would not be too far off base.

Hidden in plain sight
A downside to 3D film production in past decades has been the simple mechanical challenge of getting film elements to register cleanly. Not only did the negative in the camera have to register properly in order to produce the elements for a clean 3D image, but then diverse film projectors in diverse theatres had to project the two film elements in sync with precise calibration of the overlap of the two images.

Digital technology now allows perfect synchronisation and overlap of stereoscopic elements, which can be exhibited perpetually with zero degradation in quality. This advance in production — and the greater standardisation of exhibition parameters necessitated by digital technologies — have further opened up the opportunities for stereoscopic broadcast.

Many projection facilities have projection equipment that can accommodate 3D content, though stereoscopic theatrical releases remain relatively few. If 3D is to become a widely distributed feature of broadcast, widescreen 3D releases will not provide any great percentage of the content. So where would a regular supply of 3D content come from?

There is a surprisingly large amount of 3D content hidden in plain sight. Shown at the Axis demonstration were 3D colour newsreels of young Elizabeth II — an example of the unique treasures hidden away in archives, some of which have remained virtually unviewed for decades. As Turner leveraged its MGM archive into one of the great cable movie channels, TCM, there are vast 3D libraries ready to be digitised for broadcast. Digital post technologies allow easy, on-the-fly cleanup of these film originals. The Quantel at the Axis presentation showed off the ease with which negative dust and scratches were erased from the digital elements of the Queen Elizabeth footage.

Today’s effects-rich media, in which even the most humble productions feature some 3D graphics or compositing work — in title sequences, at least — is another untapped gold-mine of stereoscopic content. The great open secret of 3D programming is that every frame that comes out of a 3D graphics program is ready for immediate adaptation to stereoscopic motion pictures. It already exists as a 3D image within the computer and, with the term ‘rendering time’ becoming an anachronism, outputting the POV of a second virtual camera can be done, almost literally, at no extra cost.

Also, the conversion of 2D productions to 3D is a rapidly developing specialty. At its most basic, the process uses simple, familiar compositing technologies. From the 2D footage, a background plate and other elements of characters or foreground are created. Multiple layers of these can be manipulated along the z-axis like cardboard cut-outs in a diorama. On the other end of the spectrum are more sophisticated technologies which calculate entire, detailed 3D spaces out of existing 2D footage, which are beyond the scope of this article — for the time being.
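The basic "cardboard cut-out" conversion is simple enough to sketch: each extracted layer is shifted horizontally by an amount proportional to its assigned depth, once per eye, then composited back-to-front. A minimal NumPy illustration, with made-up layers and disparities, not any particular conversion house's pipeline:

```python
import numpy as np

def render_eye(layers, eye):
    """Composite layers back-to-front, shifting each by its disparity.

    layers: list of (image HxW, alpha HxW, disparity_px) ordered back-to-front.
    eye: -1 for left eye, +1 for right (the shift directions are opposite).
    """
    h, w = layers[0][0].shape
    out = np.zeros((h, w))
    for img, alpha, disp in layers:
        shift = int(round(eye * disp / 2))
        img_s = np.roll(img, shift, axis=1)    # horizontal shift only
        a_s = np.roll(alpha, shift, axis=1)
        out = img_s * a_s + out * (1 - a_s)    # over-composite onto result
    return out

def stereo_pair(layers):
    """Render the left/right views of a layered 'diorama'."""
    return render_eye(layers, -1), render_eye(layers, +1)
```

The flat-layer limitation is visible right in the code: every pixel of a layer gets the same shift, so each cut-out stays a flat card; the more sophisticated methods the article mentions assign depth per pixel instead.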

As with most broadcast technologies which showcase visual spectacle — HD programming springs to mind — new 3D content tends to be confined to sporting events, stage performances, and nature programmes. The 3D family melodrama has yet to be made. These spectacle types of entertainments are designed to directly engage a viewer on a visceral level, and the stereoscopic experience — like the HD, 5.1 surround experience — has the potential to augment that.

Another, more subtle element is that these types of content emphasise the documentation of a real event — generally one in which the audience maintains a static point of view. 3D presentations can often mimic the experience of watching something from a single point of view, the illusion sometimes being interrupted when the camera begins to move. If stereoscopic production and post are not handled skillfully, a moving camera can irritate the viewer rather than enhance the 3D effect.

3D for HD broadcast
The elephant in the room regarding the new 3D push is: ‘Is it really a new revolution? Or is it the same old thing one more time?’ The truth is, at least one journalist — though fascinated and inspired by the technology — left the Axis 3D presentation with stinging eyes and a headache.

Stereoscopic photography was developed in the 1840’s, on the heels of the photographic technique itself, and its basic principle has remained virtually unchanged.

Much press has stated that we are poised on the edge of a paradigm shift in which 3D presentation will be ubiquitous or, some would even argue, the norm. But stereoscopic theatrical exhibition was vigorously promoted in the 1950’s and despite continuing improvements in the technology, did not take hold as many hoped it would. Is this the old saw of repeatedly performing the same actions, but expecting them to produce different results?

Despite IMAX and other big screen 3D venues, the new outlet for 3D content might well be HD broadcast. 3D LCD monitors, including the Planar StereoMirror professional display were exhibited at the Axis demo, but for consumers to trade in their HD monitors — which themselves required months of nervous window shopping and saving — for 3D monitors will require a saturation level of 3D content which, at this juncture, would seem decades away. Time-tested technologies using conventional monitors, which can be viewed with special glasses, will be the standard 3D exhibition for the foreseeable future. The 3D Holy Grail of ‘no special glasses’ — beyond a few specialty venues — will not be adopted by home viewers.

The Beijing Olympics may well be the trial by fire for 3D broadcast. The Games will feature a channel dedicated to stereoscopic coverage of events. East Asia has remained at the forefront of 3D broadcast content, and it will be vital for European producers to study the behaviour of East Asian audiences and the strategies of their broadcasters. The Beijing ‘3D Olympics’ will also be a laboratory for a dedicated 3D production workflow and 3D troubleshooting and problem solving in multiple settings.

One shocking fact presented at the Axis Films workshop might be enough to rock the foundation of every 3D business plan in the works. Roughly 8% of the population cannot see stereoscopic video. This is due to a range of factors, including partial blindness, amblyopia (‘lazy eye’), and focus difficulties.

Whether or not a broadcast revolution can be built on a technology that immediately excludes 8% of its audience remains to be seen."

Neal Romanek is a screenwriter and journalist living in London. He attended USC’s Cinema-TV Production program and writes for a diverse collection of entertainment media publications in Europe and the USA.

Source: TVBEurope