ESPN’s 3D channel is halfway through a one-year trial in which it must prove a business case or risk being pulled from the air. The network, which launched in June carrying 25 FIFA World Cup matches and plans to produce 94 live events in its first year, will have its future reviewed in early 2011.
“We committed to a full year of trial of ESPN 3D and we’re preparing for a second year, but whether this is something we repeat or continue or cut is something that at this point we have very little indication on one way or another,” ESPN Senior Director of Technology, Jonathan Pannaman, told the Sports Broadcast Europe conference.
“We’re still not sure what makes sense for 3DTV and we don’t yet see a proven ROI,” Pannaman said. “At the same time the buzz is huge and we are hopeful of a huge push by the Consumer Electronics Association ahead of Christmas to market 3DTV sets and services, and we’re seeing more stereo 3D movies and 3D Blu-ray discs come to market.
“Regardless of whether we continue as an event-based network, go to a 24/7 network or switch to VoD, we definitely have to make production efficiencies to make it work. We’ve also got to get more eyeballs looking at 3D to get some idea of acceptance in the marketplace.”
Pannaman is leading the sportscaster’s 3D task force whose premise he said is to find technology that will allow it “to do ubiquitous production of 3D with an absolute minimum of additional cost” over 2D production.
“That’s a tall order,” said Pannaman. “The current approach is based on the film model but it’s our focus to reduce and change that. We have to bring in more automated rig correction, even to the point where there is a single workstation which can manage many tasks. Currently we are fielding a convergence operator for each camera position. That economy can’t be allowed to continue.”
The broadcaster is using ESPN Wide World of Sports, a new theme park experience at Disney World and the largest multi-sport facility in the US, to organise a week-long test session for 3D technology in December. All major rig manufacturers and 3D acquisition suppliers will be invited to set up their systems on a variety of sports events for a side-by-side shoot-out.
“This is a big bang theory to test how each manufacturer’s rigs and conversion technologies work,” said Pannaman. “We will test everything.”
ESPN is also to conduct a major study into depth metadata and depth analysis.
“This is a major topic which impacts events downstream. For example, we need to think very carefully about how we place closed captions and graphics in stereo. Do we need to develop some automated alarm which will alert us before we go to air if a graphic is going to occlude the image? These are monumental challenges which are vital to enabling us to produce good 3D.”
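The kind of automated alarm described here could, in principle, amount to a simple disparity comparison. The sketch below is purely illustrative (the function name, sign convention, and toy data are assumptions, not ESPN's system): it flags a graphic whose screen-plane placement would put it behind scene content it overlaps.

```python
import numpy as np

def graphic_occlusion_alarm(scene_disparity, graphic_region, graphic_disparity):
    """Flag a graphic whose depth placement conflicts with the scene.

    scene_disparity   -- 2D array of per-pixel disparity (px) for the video,
                         positive = in front of the screen plane
    graphic_region    -- boolean mask of the pixels the graphic covers
    graphic_disparity -- the single disparity (px) the graphic sits at
    """
    # A graphic occludes correctly only if it sits in front of everything
    # it covers; otherwise depth cues conflict and the shot looks broken.
    scene_in_front = scene_disparity[graphic_region] > graphic_disparity
    return bool(scene_in_front.any())

# Toy example: the scene pops out to +12 px in the lower third, so a
# caption placed there at only +5 px would appear "behind" it -> alarm.
scene = np.zeros((9, 16))
scene[6:, :] = 12.0
caption = np.zeros((9, 16), dtype=bool)
caption[7:9, 2:14] = True
print(graphic_occlusion_alarm(scene, caption, 5.0))   # True -> raise alarm
```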
By Adrian Pennington, TVB Europe
Blackmagic Design announced a new software update for HDLink Pro 3D DisplayPort, the world's most affordable 2D/3D and 2K monitoring solution for HDMI, DVI and DisplayPort displays. This latest update adds HDMI 1.4a output support for full resolution 3D stereoscopic monitoring.
Traditionally, monitoring stereoscopic 3D video means having to compromise on the image quality as 3D encodings such as Side by Side or Top and Bottom throw away half of the picture resolution in order to combine both left and right eye video into a standard video frame.
With this latest 3.5 software update, HDLink Pro 3D DisplayPort now also supports 3D monitoring using the HDMI 1.4a standard. This means that both the left and right eye video from the HD-SDI connection is packed into an extended frame dimension and sent simultaneously to an HDMI 1.4a compatible 3D monitor, providing stunning full resolution 3D images without sacrificing picture quality.
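The resolution arithmetic behind this is easy to make concrete. A minimal sketch of the per-eye pixel budget for 1080p under the common framing schemes (the scheme names are standard; this is not Blackmagic code):

```python
# Per-eye pixel budget under three common 3D framing schemes, for 1080p.
FULL_W, FULL_H = 1920, 1080

side_by_side   = (FULL_W // 2, FULL_H)   # each eye squeezed to half width
top_and_bottom = (FULL_W, FULL_H // 2)   # each eye squeezed to half height
# HDMI 1.4a frame packing stacks both full-resolution eyes in one
# oversized frame (for 1080p: 1920x2205 = 1080 + 45-line gap + 1080).
frame_packing  = (FULL_W, FULL_H)

for name, (w, h) in [("side-by-side", side_by_side),
                     ("top-and-bottom", top_and_bottom),
                     ("frame packing", frame_packing)]:
    print(f"{name:15s} per-eye {w}x{h} = {w*h/(FULL_W*FULL_H):.0%} of full HD")
```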
The new HDLink 3.5 software update is available now free of charge from the Blackmagic Design website to all Blackmagic Design HDLink customers. HDLink Pro 3D DisplayPort is available from Blackmagic Design resellers for US $495.
Source: Blackmagic Design
Minnetonka Audio Software announces a new SurCode for Dolby E product designed to work inside all common video post production platforms, including Avid’s Media Composer. The SurCode for Dolby E Stream Player plug-in provides real-time decoding of Dolby E streams and files to a stereo output.
Based on embedded metadata, the real-time downmixing of 5.1 to stereo is automatic, allowing the operator to concentrate on the task at hand. Downmixing is a true emulation of what a consumer equipped for stereo playback would hear. Since Dolby E is commonly used to simultaneously carry a surround mix, a stereo mix and/or multiple foreign language tracks, the SurCode for Dolby E Stream Player plug-in also lets the operator choose which program to decode and monitor within the range of available program configurations.
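As an illustration of what such a metadata-driven downmix does, here is a minimal sketch of a generic ITU-style Lo/Ro fold-down (the function and default coefficients are illustrative, not Minnetonka's implementation; in Dolby E the mix levels come from stream metadata):

```python
import numpy as np

def downmix_51_to_stereo(L, R, C, LFE, Ls, Rs, cmix=0.707, smix=0.707):
    """ITU-style Lo/Ro downmix of a 5.1 mix to stereo.

    cmix/smix are the centre and surround mix levels; in a real Dolby E
    workflow they come from the stream's metadata (e.g. -3 dB = 0.707)
    rather than being hard-coded. The LFE channel is conventionally dropped.
    """
    Lo = L + cmix * C + smix * Ls
    Ro = R + cmix * C + smix * Rs
    return Lo, Ro

# Toy one-sample example: centre-only content folds equally into both sides.
Lo, Ro = downmix_51_to_stereo(*(np.array([x]) for x in (0.0, 0.0, 1.0, 0.5, 0.0, 0.0)))
print(Lo[0], Ro[0])   # 0.707 0.707
```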
Visual indication of bit depth and frame rate is also provided, as is the display of all metadata extracted from the Dolby E stream or file. The plug-in includes the ability to save existing metadata as either a text or XML file, for archiving, workflow automation, and media asset management.
The SurCode for Dolby E Stream Player plug-in is compatible with the RTAS, VST, AU and AudioTools AWE formats, and is available from Minnetonka Audio Software online or at your local retailer. Manufacturer’s suggested retail pricing is set at US$995 for an all-platform license.
The catalyst for today’s Display Daily article was a comment offered by Josh Greer, the President and Co-founder of RealD, at the 3rd Annual 3D Entertainment Summit recently held in Burbank. Greer stated that consumer electronics companies are targeting April 2011 for an initial offering of LCD 3DTVs utilizing passive polarized glasses and based on RealD’s ZScreen technology.
This article is a "think-through" of the prospects for this technology as a competitor in the home 3DTV marketplace. Let’s start with a quick lay of the land.
The current state of the art in 3DTV technology is based on the use of shutter glasses. The advantages of this approach include the production of a decent 3D image and the fact that the technology can be integrated into current generation LCD and plasma displays at little additional cost. The consumer is basically buying a conventional, albeit higher end, 2DTV. One disadvantage is that enabling the 3D option means spending the money to buy "expensive" shutter glasses. The other disadvantage is, of course, the awkwardness associated with wearing shutter glasses.
At the other extreme, the Holy Grail in 3DTV is a glasses-free technology. Despite recent developments, the simple fact is that such technology is just not ready for prime time and commercial products are likely years off.
This leaves the middle ground: 3DTV based on passive glasses. Such an approach has users wearing inexpensive and presumably comfortable glasses. These glasses do not require batteries or an IR emitter.
One such passive technology has been around for a while. It is called patterned retarder or MicroPol. To implement this technology, a sheet consisting of an array of waveplate stripes is precisely positioned on the front of the LCD. This is an expensive component. Adding to the expense is the fact that the logistics of the supply chain are very unfavorable, with fabrication and application of the patterned retarder sheet currently available principally from a single supplier based in Japan. Other disadvantages of the approach include the fact that the vertical resolution of the 3D image is reduced by half and that the vertical viewing cone is quite restricted. Although undesirable, neither of these latter disadvantages is necessarily a killer problem.
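The half-resolution penalty follows directly from how the stripes are fed: alternating display rows carry alternating eyes. A toy sketch (illustrative only, with made-up pixel values):

```python
import numpy as np

def interleave_for_patterned_retarder(left, right):
    """Row-interleave a stereo pair for a patterned-retarder (MicroPol) panel.

    Even display rows carry left-eye lines and odd rows carry right-eye
    lines, which is exactly why each eye sees half the vertical resolution.
    """
    assert left.shape == right.shape
    out = np.empty_like(left)
    out[0::2] = left[0::2]    # left-eye stripes
    out[1::2] = right[1::2]   # right-eye stripes
    return out

# Made-up frames: left eye all 1s, right eye all 2s.
left  = np.full((1080, 1920), 1)
right = np.full((1080, 1920), 2)
frame = interleave_for_patterned_retarder(left, right)
print(frame[:4, 0])   # [1 2 1 2]
```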
RealD’s flat screen ZScreen approach is likely a descendant of the computer monitor ZScreen developed by the seminal 3D company, StereoGraphics, which was acquired by RealD in 2005. It is basically a screen-sized electro-optical polarization switch. As was the case for the patterned retarder sheet, the addition of a ZScreen constitutes an expensive modification to a conventional LCD 2DTV.
The ZScreen approach does, however, have an important advantage. Since the ZScreen operates in a time sequential mode, the resolution in the 3D mode is not reduced from that presented by the display in the 2D mode. Another potential advantage is that in principle, it can be cost effectively produced by existing manufacturers of LCDs and assembled by existing module makers.
On the down side, the addition of the ZScreen does introduce a reduction in image brightness, as well as some angular dependence to the quality of the 3D image.
When the advantages and disadvantages are added up, the patterned retarder approach has not been able to capture any market share. The story for the ZScreen may be different. Greer set reasonable expectations by stating that he does not anticipate the RealD passive 3D approach to replace current shutter glasses based 3DTV offerings but, rather, to co-exist and carve out a portion of the market. I would agree, the combination of advantages and trade-offs offered by the ZScreen approach should allow it to achieve this modest level of success.
By Art Berman, Display Daily
Can you manipulate the 3D positioning of video tracks in 3D space?
You can do this in a couple of different ways in Vegas Pro 10. First, you can use the 3D track and parent compositing tools that have been included in Vegas Pro for several versions. These tools enable you to create 3D perspective and manipulate tracks on three-dimensional planes.
Now, with the new stereoscopic 3D editing tools, you can also adjust and add 3D depth to your projects. You can use the Stereoscopic 3D Adjust filter to change the horizontal offset of your stereoscopic clips and subclips to bring the 3D image closer to or farther away from the plane of the viewing screen. You can use the same filter to create 3D depth on the text and other 2D elements in your 3D project. These elements will still be flat (since they're two dimensional), but you can use the same tools to place them in front of or behind the screen plane.
Finally, you can use the new Stereoscopic 3D Camera tools in the Track and Parent Motion windows to adjust the depth of tracks that you have placed in 3D space with the 3D compositing tools mentioned above.
Will it be possible to author 3D DVDs or Blu-Ray discs in DVD Architect using video that you created in Vegas?
Yes. Since you can render your 3D projects out in any of the supported render formats, you can render them for import into DVD Architect and then burn them to DVD or Blu-ray disc. However, the current version of DVD Architect does not create 3D Blu-ray (3D BD) discs, so while you can burn a Blu-ray disc that contains 3D content, you cannot yet use DVD Architect to author a high-definition 3D BD disc.
Would it be possible to create a simulated 3D from a standard AVI?
You can use the Stereoscopic 3D adjust filter to create horizontal offset on a 2D clip. This enables you to move the clip back and forth from the screen plane. In some cases, depending upon the nature of the footage you're using, this may create an acceptable 3D effect, but there is no tool in Vegas Pro 10 for "turning 2D into 3D."
Does Vegas 10 use a motion tracker?
There is no built-in motion tracker in Vegas Pro 10. However, there are third-party motion tracking tools that work as plug-ins to Vegas Pro. For instance, the new Boris Continuum Complete 7 plug-in from Boris FX works great with Vegas Pro 10 and does a very robust job of motion tracking.
Does Vegas Pro 10 support the Cineform Neo3D codec?
Yes. Vegas Pro 10 can open Neo3D files when the free NeoPlayer, or NeoHD, Neo4K, or Neo3D, is installed. Rendering to Neo3D requires NeoHD, Neo4K, or Neo3D to be installed.
Can you set up Vegas to always use Stereo 3D when you open a new project?
Yes. In the Video tab of the Project Properties dialog box, set your properties the way you want them. Then, select the Start all new projects with these settings checkbox and click OK. Now every time you start a new project, it will start with the property settings you specified.
Does Vegas Pro 10 support using the NVidia 3D cards as a Secondary Windows Display? Will there be support for Nvidia 3D Vision or Quadbuffer OpenGL for 3D preview?
At this time, we support NVIDIA 3D Vision Pro or 3D Vision for Quadro setups.
How can I render a 3D project with two separate video streams for left and right?
If you want to render your 3D project to two separate files (one for Left-eye and one for Right-eye), then (in the Render As dialog box) click the Custom button. In the Custom Settings dialog box, click the Project tab. Now, select ‘Left only’ from the Stereoscopic 3D mode drop-down list. Next, repeat these steps and, in the Project tab of the Custom Settings dialog, select ‘Right only’ from the Stereoscopic 3D mode drop-down list.
If you’ll frequently need to override your project property settings with this setting in the future, create a custom template with these settings so that you don’t have to perform this customization step every time. When you’re done, click OK to dismiss the Custom Settings dialog box and continue your render as normal.
Do the new stereoscopic 3D tools in Vegas work with generated media or text? For example, can I create a lower third in Stereo 3D?
Yes. You can apply the new Stereoscopic 3D Adjust filter to a 2D image (including generated text and graphic media) and adjust the Horizontal Offset value to move the image in front of or behind the screen plane. You can also use the new Stereoscopic 3D Camera properties on a track that you've set into 3D compositing mode (with either the Track or Parent Motion tools) to move your 3D tracks closer or further away.
In a stereoscopic 3D project, does the Stereoscopic 3D Adjust plug-in filter have tools to make adjustments to parallax?
Yes. You can use the Horizontal Offset slider to move the two images closer together or further apart (thus adjusting the parallax). This moves the resulting 3D image closer to or further from the screen plane (both above and below screen level).
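Conceptually, a horizontal-offset control just slides the two eye views apart or together. The sketch below is illustrative only, not Vegas Pro's implementation (the function name and sign convention are assumptions, and edges wrap here where a real tool would crop or pad):

```python
import numpy as np

def apply_horizontal_offset(left, right, offset_px):
    """Shift the two eye views apart by offset_px pixels, which is what a
    horizontal-offset slider does conceptually: changing the parallax
    moves the fused image relative to the screen plane.
    """
    l = np.roll(left,  -offset_px // 2, axis=1)
    r = np.roll(right,  offset_px // 2, axis=1)
    return l, r

# A feature at the same column in both eyes sits on the screen plane;
# after a 2 px offset it carries 2 px of parallax.
left = np.zeros((1, 8))
right = np.zeros((1, 8))
left[0, 4] = right[0, 4] = 1.0
l, r = apply_horizontal_offset(left, right, 2)
print(int(np.argmax(l)), int(np.argmax(r)))   # 3 5
```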
Does Vegas 10 pro support 3D compositing through layering?
Yes. While Vegas has supported 3D compositing for some time, Vegas Pro 10 adds a stereoscopic 3D camera for 3D track compositing. What this means is that Vegas will compute the 3D track compositing from two different virtual camera positions, creating true stereoscopic 3D output based on the positions and depths of your elements in 3D space.
What software creates these to begin with or is it used with a stereoscopic rig?
There are three stereoscopic 3D scenarios. First, you could have a single file with two streams of video. These files could be created by a single video camera that has two lenses (there are a few of these on the market already). Each lens shoots the image from a slightly different angle and the video from each is stored on separate streams in the resulting video file.
Second, you may have a single file with a single stream of video. In these files, the left and right video is arranged within a single frame, for instance side by side or one above the other. Some cameras shoot this type of file, and such files can also be created with software tools.
The third scenario — and perhaps most common at this time for Vegas Pro users — is where you have two separate files that were shot on two separate cameras which were mounted next to each other on some sort of 3D camera rig. Vegas Pro 10 supports all of these scenarios.
Can I preview on a professional external monitor while editing a 3D project in Vegas Pro?
Absolutely! You can use the Preview on External Device tools to send your output to your 3D monitor. You can also set your Preview Device preferences to use either the project stereoscopic 3D mode that you established when you set your project up as 3D, or override the project settings and use whatever stereoscopic 3D mode your professional external monitor supports, even if that mode is different from the mode you're using to preview your project in the Vegas Pro Video Preview window.
Is this 3D technology the old type where you have to wear the blue and red lens glasses to see it, or the new technology for creating it?
Vegas Pro 10 supports both "old" and "new" technologies. The "old blue and red lens type" you're referring to is anaglyphic 3D technology and Vegas Pro supports red/cyan, green/magenta, and amber/blue anaglyphic modes (red/cyan is the most common and likely the one that you refer to in your question).
In addition, Vegas Pro supports stereoscopic 3D modes that require polarized 3D glasses and even the sophisticated active-shutter glasses that (arguably) give the best-quality 3D viewing experience.
Can you explain the differences between 3D glasses? For example, why are there red and blue glasses, and what's different between these and the ones they use at movie theaters? Are there other types of 3D glasses?
There are three common types of 3D glasses in use now. All of these glasses do the same basic thing: they filter video information before it reaches your eyes so that your left eye sees something different than your right eye does, thus causing your brain to perceive the image in 3D.
The “red and blue” glasses that you mention are actually red and cyan and they come from a class of stereoscopic 3D called anaglyphic. These glasses work by filtering out specific colors from the video so that the left eye sees a differently colored image than the right eye does.
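The channel routing behind a red/cyan anaglyph can be sketched in a few lines. This is a generic illustration with made-up pixel values (not Vegas Pro's renderer): the red channel is taken from the left-eye image and green/blue from the right-eye image, so the matching coloured filters route each view to the correct eye.

```python
import numpy as np

def red_cyan_anaglyph(left_rgb, right_rgb):
    """Build a red/cyan anaglyph frame from a stereo pair.

    Red comes from the left-eye image; green and blue come from the
    right-eye image. Colour fidelity is lost in the process, which is
    the anaglyph method's main drawback.
    """
    out = right_rgb.copy()
    out[..., 0] = left_rgb[..., 0]    # red channel from the left-eye image
    return out

# Made-up 2x2 RGB frames: left = (10, 20, 30), right = (40, 50, 60).
left  = np.dstack([np.full((2, 2), v) for v in (10, 20, 30)])
right = np.dstack([np.full((2, 2), v) for v in (40, 50, 60)])
print(red_cyan_anaglyph(left, right)[0, 0])   # [10 50 60]
```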
Another type of 3D glasses uses polarized lenses to filter the video information before it reaches your eyes. These require a special display that polarizes certain pixels for one eye and certain pixels for the other eye.
A third type is called active shutter. These glasses (the most sophisticated of the bunch because they can support high-definition 3D) have lenses that actually open and close extremely rapidly in alternating succession. You use them to watch a video monitor that sends the left and right eye information in an alternating succession that exactly matches that of the glasses so that when the monitor sends out the right eye image, only the right shutter on your glasses is open, so only your right eye sees the image. Then both monitor and glasses switch to the left eye, then back to the right. Of course, this all happens so rapidly that you don’t perceive it and you see a cohesive 3D image.
You can deliver format-agnostic content (like side by side) that can be watched using different technologies. For example, YouTube 3D takes side by side but the playback viewer can reformat it for different monitor technologies. Side by side on a BD or DVD can be watched on a passive polarized monitor or on an active glasses monitor. Vegas Pro 10 enables you to render files that will work with any of these types of glasses.
If a Stereoscopic 3D video is shot with two cameras, can these be any regular cameras?
Yes. There doesn't have to be anything special about the individual cameras. However, you'll get the best results if you use the identical model camera for both and use identical settings on the cameras so that the images they shoot will be as closely matched as possible.
You'll also want to make sure that the cameras are mounted together in a very specific manner so that you don't have alignment problems. Vegas Pro 10 features the new Stereoscopic 3D Adjust filter to solve many alignment problems, but it's best not to have the problems to begin with!
Does Vegas 10 import and export 3D-avi files as generated by the Fujifilm W3 camera? Can these be edited on a single timeline and then exported as 3D-avi, side-by-side, and individual left & right files? What levels of compression are available?
Vegas Pro 10 supports files from the Fujifilm W3. These files can be edited on the timeline just like any other files. You can create the side-by-side and individual left & right files. However, Vegas does not render to the 2-stream AVI files. The level of compression is based on the format and template selected.
Editing Stereoscopic 3D in Vegas Pro 10
Another video tutorial about Sony Vegas Pro 10 (3D part: 08:40 -> 20:30).
Source: Sony Creative Software
This White Paper presents an overview of over the top (OTT) streaming and how it fits into the IPTV and VOD markets. It explains the principles of OTT, considers the differences between OTT and IPTV, looks at the challenges facing this new approach to service delivery and presents the three major contenders aiming to become the industry’s technical standard.
Thursday, October 28, 2010
The last 5 years have seen the successful re-introduction of stereoscopic 3D content in movie theatres, with audience interest peaking in 2009. Consequently, the manufacturing of 3D TV sets and the deployment of 3DTV infrastructures began in 2010, as it is widely accepted that the next step in the evolution of home entertainment is going to be a transition from HDTV to 3DTV.
All the solutions envisaged so far are based on concepts similar to those used in cinema, that is to say using glasses (polarised or shutter). It is natural to foresee that, for home applications, the coming years will bring a trend to improve the spectator's level of immersion in the scene even further by eliminating the need for glasses (e.g., through the use of auto-stereoscopic or light field displays), and to introduce new 3D audio experiences and more compelling 3D interactivity platforms.
To support this action, the MUSCADE Consortium organises a 1-day 3DTV Workshop targeting the production, transmission, rendering and display of more immersive 3D content than just 3D-Stereo. This workshop will take place in Rennes, France close to the R&D Centre of Technicolor, one of MUSCADE's partners, on 16th December 2010.
In addition to keynote and paper sessions, some demonstrations illustrating the concepts described above will be presented. This event is envisaged as a great opportunity for disseminating and promoting the emerging MUSCADE technologies among attendees.
Click here to access the program of this one-day workshop.
To register for this workshop, please click here.
Thursday, October 28, 2010
Film processing and post-production facility iLab is expanding its visual effects capabilities with a view to post-converting 2D features and TV projects into stereo 3D.
Its Poland Street office will grow to include at least 20 extra seats for conversion work. Tom Horton, a former Molinare VFX producer, will head the VFX operation, while David Fowler, formerly of MPC, joins as head of technology.
The move comes after iLab’s parent, Indian communications giant Reliance Media Works, ended its deal with conversion company In-Three. The plan had been to establish in India what Reliance said would have been the world’s largest 2D to 3D conversion facility. The In-Three and Reliance partnership, announced last December, was expected to be able to convert 15-25 feature film projects a year.
“In-Three has terrific technology, and we’re keen to continue the cooperation, but we agreed to end that part of the relationship and we are now in talks about licensing their technology,” said Patrick von Sychowski, head of strategy for Reliance Media Works in London.
By Adrian Pennington, Broadcast
ITRI (Industrial Technology Research Institute) introduces i2/3DW, an innovative technology to integrate 2D and 3D information for simultaneous display on the same screen visible to the naked eye.
ITRI's award-winning i2/3DW is a next-generation 2D/3D switchable display technology that is the first to fully and simultaneously integrate naked-eye 3D display with traditional 2D information. This breakthrough addresses the problems previously associated with 2D/3D displays (a lack of integration forcing viewers to switch between 2D and 3D modes) and with 3D displays (blurry text and the need for special eyewear). With i2/3DW, 2D text is as clear as on a 2D screen and 3D images are as fascinating as on a 3D screen, but the two can now coexist on the same screen for optimal viewing quality.
An i2/3DW display is constructed from three primary component layers: the conventional liquid crystal display panel (LCD panel), the dynamic backlight unit (DBLU), and a 2D/3D switching layer that lies between the LCD and DBLU panels, allowing the 2D and 3D display modes to be switched automatically. This feature differentiates ITRI's i2/3DW technology from its competitors: to date, similar technologies have only offered whole-screen 2D or 3D display, and i2/3DW is the first to make a partial switch possible. ITRI's switching component is made of two polarization films, one microretarder and one low-resolution LC panel, all extremely inexpensive to make, making the i2/3DW technology affordable.
Microtips Technology announces new manufacturing capabilities of 3D active shutter glasses. Microtips, with its years of experience in the LCD display industry, is investing extensive research and engineering in the growing world of 3D.
The glasses' specifications include:
- Switch time with 10-90% of white light through in 1.3~1.5ms
- Switch time with 90-10% of white light through in 0.2ms
- Initial delay from open-10% of white light through in 0.7ms
- Head-on extinction ratio of white light (LCD R+G+B) is 2000:1
- Head-on extinction ratio of worst R: 475:1, G: 1375:1, and B: 501:1
- 70% Transmissivity for aligned polarized white light
- Power dissipation switching 120Hz, 2.4ms
- Off-angle extinction ratio of white light 20° left through right lens is 200:1 and 1000:1 at 15°
- Off-angle extinction ratio of white light 25° downward is 200:1 and 1000:1 at 15°
With multiple scriber machines, Microtips Technology has the capability to cut the LCD glass to meet any required size and shape. Microtips Technology brings customizable options paired with top quality materials. A pair of glasses will cost in the $3.00 to $4.00 range.
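For context on what the quoted extinction ratios mean for viewers: the extinction ratio maps directly to worst-case light leakage into the closed eye (perceived as ghosting), roughly 1 / ratio. A quick back-of-the-envelope on the figures above:

```python
# Worst-case leakage (ghosting) into the blocked eye is roughly the
# reciprocal of the extinction ratio, here expressed as a percentage.
specs = {"white (head-on)": 2000,
         "red (worst)":     475,
         "green (worst)":   1375,
         "blue (worst)":    501}

for name, ratio in specs.items():
    print(f"{name:16s} ~{100.0 / ratio:.3f}% leakage")
```

On these numbers, red is the weakest channel at roughly 0.2% leakage, which is still well below typical visibility thresholds for most content.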
Source: Microtips Technology
The Rachael Ray Show, along with 3-D Vision, is taking a major step forward in the exploding 3-D TV market. On October 29, “Rach’s Halloween Bash in 3-D” will be broadcast to millions of viewers with 3-D Vision’s revolutionary new 3-D process called “FullColor 3D”.
The Rachael Ray Show is the first to use this revolutionary 3-D process in a commercial TV broadcast (sponsored in part by Sara Lee). The show will be viewable in full-color and in 3-D on all existing TV sets, 2-D and 3-D, thanks to a new type of 3-D glasses which will be given away to over 2.4 million viewers in the October 25 issue of TV Guide Magazine.
Previous 3-D broadcasts shown on regular TV sets over the past 50 years used red and blue (or red and cyan) glasses referred to as “anaglyph” glasses. Although they provided 3-D, full color viewing was not possible with these glasses. In addition, due to an inherent brightness imbalance between the left- and right-eye filters used in the glasses, prolonged viewing created discomfort, eyestrain, and even headaches, forcing 3-D segments to be kept short.
Recently, a new 3-D process called “ColorCode”, used during a Super Bowl halftime show broadcast, produced an even greater brightness imbalance between viewers’ eyes.
The new “FullColor 3D” process, however, uses glasses that eliminate these problems by providing full-color images in 3-D, utilizing patent-pending balanced brightness and color filters, and can be worn during an entire 3-D show without discomfort. 3-D Vision is now making this new process available for use with all TV broadcasts. The same glasses also provide full-color 3-D viewing of all printed images for the first time ever, as demonstrated in the same upcoming October 25 issue of TV Guide Magazine.
Another revolutionary aspect of the new 3-D process used for “Rach’s Halloween Bash in 3-D” is the way the 3-D was made, allowing completion in a short time at a reasonable cost. Normally, 3-D shows can either be shot directly in 3-D using special expensive 3-D camera rigs operated by specialists with expertise in 3-D stereography, or they can be shot in conventional 2-D video and then converted to 3-D using a complex and expensive computer-assisted process that requires scores of graphic artists working simultaneously to convert the video to 3-D, frame by frame. Such conversion of a single full-length movie, for instance, can take several months and cost from $5-15 million just for the 3-D conversion alone. In contrast, using 3-D Vision’s revolutionary patented “Auto-3-D” conversion process, the Rachael Ray 3-D episode took only 2 weeks to convert, utilizing only 3 computer operators, and was done at a fraction of the cost of conventional 3-D conversion.
Another advantage of this new “Auto-3-D” conversion process is that it can produce a more natural-looking 3-D experience than the conventional conversion methods, which often produce images that look like a series of flat cardboard-like planes. This occurs because, in the conventional 3-D conversion processes, the graphic artists assign depth values to various parts of the image based on their own guesses about depth within the frame. The “Auto-3-D” process, on the other hand, uses actual 3-D information present in 2-D video frames and automatically displays scene components at different depths based on the depth information in the original frames.
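To make the contrast concrete, a naive form of depth-based view synthesis can be sketched in a few lines. This is a generic, deliberately simplified illustration (not 3-D Vision's patented process): each pixel is shifted horizontally in proportion to its depth to synthesise a second eye view, which is why per-pixel depth, rather than per-object "cardboard" planes, yields more natural results.

```python
import numpy as np

def stereo_from_depth(image, depth, max_disparity=8):
    """Naive depth-image-based rendering: shift each pixel horizontally in
    proportion to its depth value to synthesise a second eye view.

    image -- (H, W) grayscale frame; depth -- (H, W) values in [0, 1]
    (1 = nearest). Disocclusions are left as holes here; real converters
    spend most of their effort filling them plausibly.
    """
    h, w = image.shape
    right = np.zeros_like(image)
    disparity = (depth * max_disparity).astype(int)
    for y in range(h):
        for x in range(w):          # left-to-right: nearer pixels overwrite
            nx = x - disparity[y, x]
            if 0 <= nx < w:
                right[y, nx] = image[y, x]
    return right

# Toy 1x6 frame: the right half is "near" (depth 1) and shifts by 2 px,
# leaving holes where background was uncovered.
img = np.arange(6).reshape(1, 6)
dep = np.array([[0, 0, 0, 1.0, 1.0, 1.0]])
print(stereo_from_depth(img, dep, max_disparity=2))   # [[0 3 4 5 0 0]]
```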
3-D Vision developed these revolutionary patented and patent-pending technologies over the last 5 years based on its research on how the human brain works and creates the experience of 3-D, as well as on optics and TV technology principles.
Source: 3-D Vision
Advanced 3D Systems is proud to announce a new ortho-stereoscopic 3D digital processor. The Stereographer’s Friend offers real-time, low-latency digital processing in a compact 1RU frame, suitable for live stereoscopic production, drama/feature production or post production.
The Stereographer’s Friend allows real-time adjustment of digital toe-in and can correct for all the regular rig errors, such as roll and vertical misalignment, and includes lens corrections.
- Real-time Stereoscopic 3D Processor
- 1 Frame total processing delay
- Two HD-SDI Inputs
- Two HD-SDI Outputs
- Inbuilt Synchronisers
- Analogue Monitoring Output
- Compact 1U Chassis
- Auto Zoom
- Vertical Disparity
- Banana Distortion Compensation
- Flip H/V
- Keystone H/V
- Preset storage for quick recall during live performance
- Smooth match depth presets
- DVI Viewer
- Left, Right, Mix & Diff Modes
- Advanced Analysis Package
Anthony Rose, CTO at YouView (formerly called Project Canvas), has used a number of events recently to outline what YouView is and how it is going to change television. The YouView project was described by one moderator at the Streaming Media Europe conference last week as a “Pathfinding project for the evolution of TV”, and that is unlikely to prove an overstatement, given the way the hybrid IP/broadcast model will start to influence what people watch and how they watch it during the rest of this decade. Below is an outline of what has been revealed recently.
- YouView could be decoupled from Freeview via IP-only devices.
- An abstraction layer means content deals can relate to the YouView ‘platform’ and cover multiple devices, reducing the complexity of content licensing.
- The Marlin MS3 DRM will be free to use for content providers. YouView will initially support this DRM only but Flash Access will follow.
- The platform is open to multiple payment service providers, opening the way for ISPs to partner with content owners and add purchases to their broadband bills.
- YouView could support open codecs apart from MPEG-4 if they are supported in hardware.
- The platform is expected to trigger the uptake of IP Multicast by ISP networks and consumers, driving down the cost of ‘broadcasting’.
- Content discovery is critical and will not rely on the EPG (Electronic Programme Guide).
- YouView is central to ambitions for 30-40 live BBC streaming channels during the 2012 Olympics, with social interactivity.
- Linear TV is viewed as a key driver for content consumption across the platform. The migration to VOD is taking longer than expected because the “Silicon Valley kids deny the existence of linear”.
- YouView will be operated on a cost-recovery basis.
Rose describes YouView as an open platform with three components: open standards, some client software that YouView has developed to run on that platform, and a metadata ingest service. In terms of video delivery, the codec supported today is MPEG-4 within an MPEG-2 Transport Stream, with Flash providing the rendering layer. Rose says there is no reason why the company would not support other open-standards codecs but they need to be supported in hardware to provide the expected performance.
YouView is based on open standards and the specification documentation is now available at the YouView website. “Anyone can make a box and they do not need to call it a YouView box,” he noted. “They can do this in the UK or for other markets.”
Within the platform, the managed YouView code sits above the D-Bus and the OEM manufacturer code sits below the D-Bus, with the aim that the YouView user interface (UI) can flourish on multiple devices. One of the aims of the project is to help STB or other device vendors create multiple product manifestations based on the same development work.
Rose promised that content will be made available on YouView on a fair and non-discriminatory basis. Content providers can build a player and use a ‘desktop’ application to drive viewers to their video portal site or they can provide metadata for the YouView system to aid discovery. Rose expects most companies to make use of the metadata ingest.
YouView will be operated on a cost recovery basis and does not aim to exploit the platform for commercial gain, Rose told delegates at Streaming Media Europe. The company will not take payments, which includes advertising revenue shares.
Freeview and Beyond
The first YouView set-top boxes will be Freeview HD PVRs (with two HD tuners). There will be opportunities to differentiate STBs on YouView, by including DLNA home networking capabilities and WiFi, for example. Rose suggested that in the year after launch YouView may be built into television sets and that some devices could be introduced without hard drives.
There is the possibility that YouView will not always be linked with Freeview: “We have said that at launch it is a Freeview+ proposition but there are variations on a theme. It is entirely conceivable to have an IP-only device, although that does not hit the sweet spot in my opinion.” Rose added that there is nothing to stop manufacturers from swapping out the UK DTT reception components for digital terrestrial technology suited to other regions.
Blending On-Demand and Live TV
Rose told the Streaming Media Europe conference in London that the key USP (unique selling point) for YouView will be the blending of live and on-demand content, pointing out that many new devices are focusing on IP-only. “The user ends up with two remote controls, one for the television and one for the world of on-demand and we think that is missing a big opportunity,” he said.
A few days previously Rose went further, telling an audience at the Mashup event that linear creates demand for what people want to watch. “You can switch on the television and linear TV is just playing. We need a new way to create and drive demand where there is unlimited inventory. Many of the new devices we see from Apple and others almost deny the existence of linear.”
Rose admitted that when he began working on the BBC iPlayer catch-up service (in a previous role), he also thought linear channels were history. “But the world does not move that fast. I suspect moving to VOD will be much slower than originally expected and one of the reasons is because the Silicon Valley kids deny the existence of linear and the existing way that people consume content.”
YouView will provide regionalisation and localisation features by recognising where a consumer lives in the UK from the DTT signal they are receiving. That means they can be targeted down to one of 30 areas within the UK. Content owners can choose where their content is seen, which could include the whole of the UK (or even worldwide availability later).
YouView and the 2012 Olympics
YouView is regarded as a key enabler for the BBC’s planned 2012 Olympics coverage. Rose says the broadcaster will have 30-40 channels of live streaming and 5,000 hours of coverage from the event, which is clearly more than can be broadcast. The aim is to stream these (with the help of IP Multicast to support this scale of streaming).
YouView will provide new levels of interactivity. Whereas today the ‘Red Button’ interactive service on broadcast TV triggers an MHEG function, the Red Button on YouView will be able to trigger a Flash application. Rose said the developer community will be able to create apps that YouView could not conceive. “For me, that is the most exciting part. We don’t have to build the end proposition. We are providing the building blocks.”
He spoke about the ability to overlay widgets on top of live television so viewers could follow ‘Team GB’ and also track which events their friends are watching at the Games, giving them the ability to join them in watching the same content.
The Need for IP Multicast
YouView will support IP Multicast for live TV. At the Mashup event Rose noted the extreme alternatives facing content providers trying to launch TV channels today. At one end of the scale is digital terrestrial broadcasting, which costs £8 million for a DTT channel but then allows broadcasters to add new viewers at zero additional cost. At the other end of the spectrum is unicasting, where you pay almost nothing to set up a video service but have to accept a relatively high cost for each new user as the audience builds.
Rose expects IP Multicast to break this mould, with low up-front costs and a low cost for adding users. At Streaming Media Europe he predicted that IP Multicast will enable anybody to become a broadcaster. “You can be your own Big Brother,” he suggested. “You can set up your own television channel and broadcast to millions of people and be listed in the EPG.”
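The economics Rose contrasts reduce to a fixed-plus-marginal cost model. Only the £8 million DTT figure comes from his talk; the unicast and multicast parameters below are hypothetical placeholders, chosen purely to illustrate the shape of each curve:

```python
def total_cost(viewers, fixed, per_viewer):
    """Cost of reaching `viewers` simultaneous users: fixed plus marginal."""
    return fixed + viewers * per_viewer

# Hypothetical parameters; only the DTT fixed cost (£8m) is from the article.
MODELS = {
    "dtt":       {"fixed": 8_000_000, "per_viewer": 0.0},   # high fixed, zero marginal
    "unicast":   {"fixed": 10_000,    "per_viewer": 5.0},   # low fixed, high marginal
    "multicast": {"fixed": 50_000,    "per_viewer": 0.05},  # low fixed, low marginal
}

def cheapest(viewers):
    """Return the delivery model with the lowest total cost at this audience size."""
    return min(MODELS, key=lambda m: total_cost(viewers, **MODELS[m]))
```

With these placeholder numbers unicast wins only for tiny audiences, while multicast undercuts both alternatives across realistic audience sizes, which is the mould-breaking property Rose describes.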
IP Multicast will increase capacity for live streaming over the web. Rose explained that the England vs Germany game in the FIFA World Cup this summer was watched by 800,000 people on BBC iPlayer despite being available on television, resulting in 800,000 simultaneous streams. This included HTTP live streaming in what must have been one of the first examples of the technology being used at such a scale. “The game used 30-40% of the UK consumer Internet capacity and that was for one game that was also available on television at the same time,” Rose pointed out.
Looking ahead to the 2012 London Olympics, he added: “Relying on unicast is not the way forwards so we think multicast is key. Five per cent of the Internet in the UK is multicast enabled and most users do not have plug-ins that can render multicast so there has been a ‘Catch 22’ situation where the ISPs do not bother supporting multicasting in their networks and consumers do not bother using the multicast plug-in.”
Rose expects this to change, helped by the support for multicast within YouView. (YouView shareholders also include the ISPs TalkTalk and BT so these are prime candidates to drive the expansion of IP Multicast in the UK broadband networks.)
Central Role for Content Discovery
At the ‘Where is the future for Multiscreen’ Mashup event, Rose cited Joost as an example of why content remains crucial for any service. “If you do not have great content, you have a problem. But also, in a world where there are unbounded possibilities in terms of quantity, how do you get great content to the surface?”
When speaking about YouView, Rose has always emphasised the importance of content discovery and he told the Mashup audience that somebody has to shape audience desire. “You need somebody to say: ‘This is for you’,” he said. Thus recommendation is viewed as a key part of the new platform. “The recommendation system helps you to be the trusted guide that should be able to surprise and delight.”
Rose believes YouView will provide the right balance between too little content on television and too much content on the Internet. As such, it will provide a good home for medium-sized content providers (as well as others). “A small to medium sized content provider will probably enjoy better performance than on the Internet because they are in a slightly smaller pond,” he predicted.
At Streaming Media Europe, Rose said the platform would provide multiple points at which consumers could engage with and find content. “One of the criticisms we have been hearing is that YouView will favour broadcasters on the EPG, but our UX [user experience] teams are hard at work exploring various paths to content, whether that is a category view of content or favourites or personalisation systems. The EPG is only one of many ways to get to content,” he added, pointing out that content providers like Netflix (one of the many online content providers that want to get on the platform) do not have channels so will have to rely on these other forms of discovery.
Of course, the EPG will inevitably be an important focus for most consumers once they understand its key innovation: the fact that it will look backwards as well as forwards, with the backwards-facing portion showing off the content that is available on-demand (catch-up TV).
Reducing Complexity for Content Owners
Rose says YouView will help simplify the connected TV environment for content providers. He pointed to the challenges of getting content onto multiple connected TV devices, which requires separate development work and also separate deals with each manufacturer to get their content icon on the ‘desktop’. Working in the opposite direction, different licensing deals with Hollywood content owners are required for each platform content is presented on. “It is a bit like broadcasters having to have a new deal every time somebody makes a new television set,” he declared.
Rose said that because of the abstraction layer that YouView provides, someone can complete a content deal for YouView that then applies to a whole range of products. “Hollywood will be able to license once for deployment across all YouView boxes. A content provider will be able to authenticate that a device is a YouView box and choose to make all their content available on all YouView boxes,” he told the audience at Streaming Media Europe.
Initially YouView will support Marlin MS3, the Marlin Simple Secure Streaming Specification. In due course the platform will also support the Flash Access content protection solution, Rose revealed at Streaming Media Europe. Thanks to the deal between YouView and Intertrust (which licenses Intellectual Property), the use of Marlin MS3 will be “essentially free” for content providers, he added.
Paid Content and Payment Mechanisms
YouView will support paid content but the company does not want to become a ‘king maker’ by choosing a payment gateway provider, so will ensure that different payment gateways can integrate with the platform and content owners can choose their payments partner. Rose suggested, as an example, that if a consumer is using a given ISP network, the ISP could take care of the billing (on their bill) on behalf of the content owner. This presents the interesting possibility that network owners become trusted billing partners for content providers, exploiting trusted brands where they have them. Other billing options might include PayPal or Mastercard, for example.
Rose is convinced that the concept of advertising breaks will be consigned to history. He expects advertising innovations to enable advertiser messages without the traditional interruptions in viewing. One example of what might be possible, though not necessarily supported within YouView, is real-time insertion of advertising into the scenery of 3D content, like a real advertisement on a billboard that somebody walks past in a drama.
What he does expect to see on YouView is sponsored events and perhaps content-related companies leveraging the platform in new ways, like an opera house monetising its content by selling tickets rather than relying on advertising revenues. Content owners will also be able to use information about what people are watching to help them target messages more effectively.
By John Moulding, Videonet
With the YouView (formerly Canvas) and HbbTV initiatives gaining momentum for the delivery of Hybrid Broadcast Broadband (HBB) services in Europe, there is a growing interest in whether there is enough common ground in the standards and technologies that underpin them to deliver some technology harmonisation. One of the biggest prizes could be the ability for content owners to develop services for HbbTV that can also be enjoyed via YouView-compliant devices. However Jeff Hunter, Chief Architect at YouView, believes that the more ambitious scope of the YouView project means it would be difficult for some services developed for this platform to run on the current, first generation of HbbTV devices.
Hunter is keen to emphasise that discussions about YouView and HbbTV harmonisation should distinguish between the technologies used to deliver services and the wider commercial requirements that drive them. The bottom line is that they are designed for different things. He notes that the initial push for the HbbTV initiative was to deliver strong interactive services for the digital TV market in markets like France and Germany, including a teletext replacement, and making services available quickly in particular on the current generation of iDTVs (integrated digital TVs).
YouView, on the other hand, is initially designed for the UK market where ‘Red Button’ interactive services are well established and widely used, and is looking to ‘move the needle’ in terms of services and features for a market used to interactive TV.
“Different territories are in different states of market development and that is part of the reason why we are seeing different business drivers,” Hunter says. “YouView shareholders already offer successful connected television services like BT Vision [the Freeview/IPTV platform from the UK telco] and online services like BBC iPlayer and 4oD [Channel 4’s online on-demand offer] and it is about evolving those businesses into another phase of the consumer offering. That requires a more capable platform.”
Hunter also points out that YouView is creating a complete ecosystem of content, services and device partners, plus a retail brand, so these different commercial requirements must be stripped out of the harmonisation debate. He says both the business outcomes sought by the HbbTV initiative and YouView could be perfectly viable.
“If you are harmonising technology, the starting point is to agree what the commercial requirements are. That is where the challenge has been in terms of full harmonisation [for HBB] not just at a European level but globally,” he observes.
However, when it comes to finding the overlaps in enabling technology between YouView and HbbTV, Hunter says there is already common ground and potential for more. There have been discussions about how the choices made for the different specifications could map into each other.
“Of course we want harmonisation,” he says, referring to the narrower technical definition. “There are benefits to all parties for using common enabling technology and that is through the economies of scale it delivers, leading to price erosion of consumer equipment and headend services.”
If it became possible to deliver HbbTV services on YouView specification devices it would boost the chances of YouView becoming a pan-European solution for Hybrid Broadcast Broadband. European broadcasters would be able to maintain their HbbTV services but deliver enhanced interactive services as well to consumers equipped with higher specification connected devices. Whether that happens or not, it is clear YouView has ambitions for its specification to be used internationally anyway, with the aim of creating economies of scale for device and content partners.
According to Hunter: “While the UK market is large, the device partners who are investing in YouView products really want to take that investment and use it on a global scale. If our content partners can package content for YouView and then make that available outside the UK that would deliver a fantastic cost saving for them so that is an important consideration for our shareholders as well [who include producers and exporters of content like the BBC and ITV]. It is important for our partners to have a technology solution that can be deployed into a base that is larger than the UK and the target for them is global. We are already working on making the YouView story into a broader story.”
Hunter adds that a large number of companies are looking at what YouView is doing in the UK and are extremely interested in the approach. “It is seen as a very forward-looking and visionary initiative,” he observes. He adds that YouView as a service does not have any exclusivity over the content created for use on the platform. So if someone produces content in a YouView friendly format there is nothing to stop them distributing it to any device based on the same technology.
This makes the publication of the YouView specifications all the more significant. These are now available on the company’s website and have been submitted to the DTG (Digital TV Group) in the UK, the organisation responsible for maintaining the ‘D-Book’ technical bible for the UK DTT platform and which will also provide test and conformance for Connected TV services. YouView became a member of the DTG last week. Over the coming months these specifications will also be translated into documents aimed at specific stakeholders like ISPs, content providers and advertisers to explain how they can work with the platform.
YouView already makes use of many existing and emerging standards from the DVB, W3C and OIPF (Open IPTV Forum), among others. “Where there are good specifications already in the standards domain we are looking to use those,” Hunter adds.
By John Moulding, Videonet
Over the Top TV (OTT TV), the delivery of video via the internet directly to users’ connected devices, allows access to services anywhere, anytime and on any device.
The purpose of this White Paper is to introduce and describe a number of leading OTT TV platforms and to compare and contrast key aspects for each. In particular we will be reviewing YouView (Canvas), HbbTV, Google TV, SeeSaw and MHEG IC.
Friday, October 22, 2010
An interesting white paper by Barry Clark.
Source: Sony Professional
Parallell Cinéma was awarded the Dimmy's Award for outstanding interactive stereo 3D at the Dimension 3 Festival for the world's first complete stereo 3D training solution, available now on Convergence3D.
It's compatible with any screen, S3D-capable or not (a specific player has been developed).
This package is making a difference in the S3D world: apart from long and expensive training sessions, it is the only way for filmmakers to train themselves professionally in stereo 3D.
The course is comprehensive, covering all the technical and artistic changes from pre-production to post-production. It includes:
- A 2h10min course shot in live-action stereo 3D: we explain a technique, you see the result.
- 8h in a "virtual film studio": optically accurate 3D software for testing all the techniques in pre-built complex sets.
- 4h of very technical interviews with the S3D experts behind Coraline, Alice in Wonderland, Star Wars 3D, Despicable Me, Monsters vs. Aliens, etc.
MovieScope Magazine (UK) wrote: "The comprehensive DVD set explains all the techniques used in the field, including the most advanced such as non-parallel dynamic floating window, parallax range calculations, adaptation to screen sizes and more. This course gives filmmakers, from independents to long standing professionals, access to the technical skills necessary for the artistic change brought about by stereo 3D."
American Cinematographer online (USA) wrote: "Stereo 3D Filmmaking: The Complete Interactive Course makes the technical skills necessary for stereoscopic 3-D productions accessible to all filmmakers."
Sonovision/Broadcast (FR) wrote: "It's a must-have package"
Parallell Cinéma is also behind the stereo-3D capabilities of FrameForge Previz Studio, the previsualization software used by Sony, RealD, 3ality Digital and others to prepare their 3D shoots. The "virtual film studio" of the course is based on it.
TV makers are frustrated by the slow release of stereo 3-D Blu-ray discs, one of the factors holding back sales of stereo 3-D TVs. At the same time they are trying to cut the relatively high cost of 3-D TVs, another major factor holding back sales.
TV makers had anticipated sales of more than 2.1 million 3-D sets in 2010, but sales trends indicate actual sales will be less than 1.6 million, according to figures from the Consumer Electronics Association. The shortfall represents what could be a billion dollars in sales.
Hollywood studios and broadcasters are under pressure to release more 3-D TV content to perk up sales. So far few 3-D Blu-ray titles have been released for general distribution; a limited number of discs are available, sold as bundles with 3-D-enabled Blu-ray players and TVs.
Hollywood studios see a big market for Blu-ray 3-D titles, but they are holding back general releases until there is a bigger installed base of players and TVs. "We haven’t reached a critical mass in homes" to broadly release 3-D discs, said Jim Mainard, head of production development/technology at Dreamworks, one of the studios that have pushed the move to 3-D TV.
It's a chicken-and-egg situation in which TV makers feel they have the disadvantage, claiming there is not enough content to drive sales of systems.
"There is not enough content right now," admitted Mainard. "Sometime in 2011, you will start to see a lot of these titles free up, not just from Dreamworks but a lot of other companies," he said.
"There will be enough content in 2011, you will see a 3-D Blu-ray section in your local DVD stores next year," he said. "I would say we are probably a year away—it's not this holiday season, but the next one before we start to see any pick up here," he added.
A Sony studios spokeswoman said her company and others have put "a number" of 3-D Blu-ray titles into general release online and at retail. "A number of major studio Blu-ray 3D releases are coming on November 16 and more will be released before the end of the year" including five Sony titles, she added.
Meanwhile live sports content is moving ahead, but faces many challenges. It takes as many as 200 people and an entire separate production flow from cameras on up to record a sports event in 3-D.
"We are spending more than twice what it takes to do a normal 2-D HD event" to capture a sporting event in 3-D, and "it’s a huge financial commitment," said Bryan Burns, vice president for strategic business planning and development at ESPN, which is leading the charge in 3-D sports.
"We are doing a game a week and struggling to get there," he said, projecting broad adoption of 3-D TV sets could take three to five years.
Burns called for the industry to support ESPN's efforts. Consumer electronics companies are said to be providing ESPN equipment for free or at low cost to help jumpstart the need for content.
Mainard of Dreamworks suggested the studio has algorithms and other technology to enable broadcasters to share some tasks across 2-D and 3-D production equipment.
Cutting 3-D TV Costs
While TV makers lobby for more content, they also realize they must cut the costs of 3-D TV sets. The sets currently carry a premium of about $400 for stereo 3-D, about $300 of that for displays with 240 Hertz refresh rates, seen as a baseline for 3-D video quality, said David Naranjo, director of product development for Mitsubishi Electric's TVs.
A video interview with Naranjo is available online.
3-D glasses also represent a significant cost. Today TV makers pair proprietary displays and glasses that cannot be used with other vendors' TV sets.
The CEA is defining a standard protocol for 3-D glasses, initially over infrared. It will also develop versions of the protocol for RF links such as Bluetooth. TV makers hope to field such glasses late next year as one way to ease the costs consumers pay for 3-D.
Getting rid of glasses is not an option for the short term. Although glasses-free technologies have been developed for small screens of 20 inches and below, they are expensive, have narrow viewing angles and can degrade 2-D video, said Dan Schinasi, a director of HDTV product planning at Samsung.
"The lenticular and parallax barrier technologies used today are dead ends," said Mainard, who has a long background in stereo 3-D starting at TRW. The industry could field glasses-free 3-D TVs "in 5-10 years, but they will be based on a different technology," he said.
TV makers will be challenged to reduce costs in the near term. They are trying to put both 3-D and Web connectivity into their latest high-end sets, each of which adds complexity and cost.
Despite the shortfall in 3-D TV, consumer electronics overall still appears to be on track for three percent growth this year, said Steve Koenig, director of industry analysis at CEA. Thank Apple and Amazon, who are driving new categories such as tablets and e-readers.
Tablets are expected to generate $4.3 billion in revenues in 2010 and e-readers are forecast to deliver another billion in sales this year. "If tablets are 40 percent stronger than we expect it would more than compensate for the shortfall in 3-D TV," said Koenig.
Nearly 75 percent of consumers in a recent poll said they plan to buy CE gear as holiday presents in 2010, a record high generating hope about a strong finish to the year, he said.
By Rick Merritt, EE Times
KDDI R&D Laboratories has developed the world's first technology for synthesizing and displaying 3D video of stadium sports in real time, from any angle chosen by the viewer. This technology was exhibited at CEATEC JAPAN 2010.
"Until now, the viewing angle for live soccer has only been changeable by switching between camera positions. This technology provides a new video experience, by offering views from angles where cameras can't be installed, such as the middle of the pitch or in the air."
"This screen shows the stadium as seen by eight HD cameras, and these are synthesized pictures from places where cameras aren't installed, including aerial views, and shots from in among the players. On the other hand, this screen shows pictures from a single high-resolution camera, of the sort used for 4K rather than HD. This system creates pictures from any angle chosen by the viewer. Because the camera has such a high resolution, there's relatively little loss of picture quality even if it zooms right in among the action."
This technology uses a unique method to enable high-speed estimation of players' positions on the field in 3D. It utilizes the fact that when soccer is broadcast, the image consists simply of players and a background.
"What viewers mainly want to see is the players. So we thought of technology for synthesizing pictures that look natural, by cutting out just the parts with players, and putting them against a plane. Then, the plane is transformed, in line with the direction or angle the viewer has chosen. The pictures you're seeing now are being synthesized in real time using a single PC. So if you have a PC at home, this kind of 3D processing can be viewed immediately on a 3D TV."
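The billboard trick described above (cutting out player sprites, pinning each to a plane at the player's estimated pitch position, then re-rendering for the chosen viewpoint) rests on projecting those 3D anchor points through a virtual pinhole camera. A minimal sketch, with all parameters hypothetical and no claim to match KDDI's implementation:

```python
import math

def project_to_virtual_camera(point, cam_pos, yaw_deg, focal_px):
    """Project a 3D pitch coordinate (x, y, z) into a virtual camera's image.

    The camera sits at `cam_pos` and is rotated `yaw_deg` about the vertical
    axis; `focal_px` is its focal length in pixels. Returns image-plane
    coordinates, or None if the point is behind the camera.
    """
    # Translate into camera-centred coordinates.
    x = point[0] - cam_pos[0]
    y = point[1] - cam_pos[1]
    z = point[2] - cam_pos[2]
    # Rotate about the vertical (y) axis by the camera's yaw.
    yaw = math.radians(yaw_deg)
    xc = x * math.cos(yaw) - z * math.sin(yaw)
    zc = x * math.sin(yaw) + z * math.cos(yaw)
    if zc <= 0:
        return None  # behind the virtual camera
    # Pinhole projection: each billboard sprite is drawn at this point,
    # scaled by focal_px / zc so nearer players appear larger.
    return (focal_px * xc / zc, focal_px * y / zc)
```

A player standing 10 m straight ahead of the virtual camera projects to the image centre; because only a handful of billboards need transforming per frame, this kind of synthesis is cheap enough for the single-PC real-time rendering the demonstration describes.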
KDDI has also developed server technology that stores stadium 4K video content taken from many angles, then extracts and delivers pictures at high speed with low processing load. This achieves a real-time delivery system for pictures from any angle chosen by the viewer.
Source: DigInfo TV
Thursday, October 21, 2010
ISee3D announces the availability of its breakthrough single lens 3D technology designed to usher in a new era of 3D. Existing two lens technologies suffer from cumbersome equipment, calibration issues, complex post-production work, and high costs. ISee3D removes each of these barriers by allowing users to capture 3D video through one lens.
This advancement brings significant advantages to 3D capture, eliminating many of the problems inherent in current standards of shooting 3D. For the consumer market, it presents the first viable option to put natural, high quality 3D into existing devices with a camera. Cell phones, digital cameras, and camcorders can all utilize the ISee3D optical process to allow users to shoot and produce 3D with the ease and speed that they currently associate with 2D.
New State of Single Lens – Better, Faster, Cheaper
ISee3D’s unique optical process simplifies 3D capture in many ways. Here are some elements of the process:
- Optical Switch: The original patents describe a method of capturing stereoscopic images by occluding the left and right half of a lens in sequence, which in essence moves the ‘center’ of the lens. This shifting center allows the capture of different perspectives through a single lens. These separate perspectives can then be fused together to create a single stereoscopic (3D) image.
- Perfectly Matched Image Pairs: Because both images are coming through one lens, the image pairs are always perfectly matched. Vertical, horizontal or rotational misalignment are non-issues with ISee3D enabled technology, and unlike other 3D capture methods, focus / zoom are as easy and natural in 3D as they are in 2D.
- Scalability: Because ISee3D’s technology scales across device sizes, it can be applied to many different industries from consumer electronics to car manufacturers, from health care to security. In each case, the optical switch can work within any device from endoscopes and cell phones, to the Hubble Space Telescope — making it possible for scientists, doctors, independent film-makers and average consumers to capture in 3D.
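The optical-switch idea in the first bullet reduces to a sequencing problem: alternate exposures see the left and right halves of the aperture, and consecutive captures are fused into stereo pairs. A toy sketch of that logic, not ISee3D's implementation:

```python
def fuse_stereo_pairs(captures):
    """Pair alternating half-aperture exposures into (left, right) frames.

    Even-indexed captures are taken with the right half of the lens occluded
    (so they record the left perspective); odd-indexed captures record the
    right. Two consecutive captures therefore fuse into one stereoscopic
    frame. An unpaired trailing capture is dropped.
    """
    pairs = []
    for i in range(0, len(captures) - 1, 2):
        pairs.append((captures[i], captures[i + 1]))
    return pairs
```

Because both half-aperture views pass through the same glass, the pairing step needs no geometric alignment, which is the "perfectly matched image pairs" property claimed in the second bullet.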
ISee3D is driving the universal adoption of 3D single lens capture – from consumers and Hollywood to health care and military. The company is introducing the first commercially feasible single lens, single camera 3D capture technology. Scalable in size from 1 mm in diameter to more than 250 mm, the patented technology can be applied across many devices.
Enjoying S-3D content today means going to the local Cineplex or shelling out for a 3DTV and 3D Blu-ray player or set-top box, and enjoying the handful of titles that are available. PC gaming has offered another avenue to S-3D enjoyment, and compared to a 3DTV and Blu-ray player, a 3D monitor for a PC is relatively affordable. But not everyone is big on gaming, so what to do?
One possible answer is enjoying 3D content on your 3D PC. There are lots of 3D videos on YouTube (search using: yt3d:enable=true), but to watch them using anything other than anaglyph, you must first download them to your local hard drive and then use something like Stereoscopic Player to watch them. This works great, but it is not efficient.
Another solution in the works is from NVIDIA, which is looking to expand its brand into consumer video with a portal: 3D Vision Live. The portal is officially ‘beta’ and the company is soliciting feedback on what people want from the site, although the first question has already been answered: is the web a viable medium for distribution of 3D content? Yes, according to NVIDIA.
When you visit 3D Vision Live, the site automatically determines if you have NVIDIA 3D Vision installed on your PC. If so, your screen will blank momentarily as they turn off the ‘Aero’ feature in Windows (which seems to be a common requirement with S-3D software), your IR Emitter is turned on, and voila, the video in the middle of the screen begins playing - in a window - in stereoscopic 3D.
We know that DirectX doesn’t natively support Quad Buffered Stereo the way OpenGL does, because DirectX does not have a stereo API. So NVIDIA is using its own API for now (Windows 8 is expected to have an S-3D API). The video player is Microsoft’s Silverlight, which, combined with some custom code from NVIDIA, allows the player to communicate with the 3D Vision driver. This combination unpacks the video (either Top-and-Bottom or Side-by-Side) and renders it out frame-sequential. All rendering is done on the client PC.
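The unpacking step described above can be sketched in a few lines. This is a minimal illustration of how a frame-compatible image is split into left/right views and re-ordered for frame-sequential display, not NVIDIA's actual code; the function names and list-of-rows frame representation are assumptions for the example.

```python
def unpack_frame(frame, packing):
    """Split one frame-compatible image into (left, right) views.

    frame   -- a 2D list of pixels (rows x columns)
    packing -- "side_by_side" or "top_and_bottom"
    """
    rows, cols = len(frame), len(frame[0])
    if packing == "side_by_side":
        left = [row[: cols // 2] for row in frame]   # left half of each row
        right = [row[cols // 2 :] for row in frame]  # right half of each row
    elif packing == "top_and_bottom":
        left = frame[: rows // 2]   # top half of the frame
        right = frame[rows // 2 :]  # bottom half of the frame
    else:
        raise ValueError("unknown packing: %s" % packing)
    return left, right


def frame_sequential(frames, packing):
    """Interleave unpacked views as L, R, L, R... for shutter glasses."""
    out = []
    for f in frames:
        left, right = unpack_frame(f, packing)
        out.extend([left, right])
    return out
```

In a real player each view would then be scaled back to full frame size before display, which is where the resolution loss of frame-compatible delivery becomes visible.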
Silverlight seems like an unusual choice for the streaming technology, as Adobe’s Flash enjoys dominant player market share, especially with all of the YouTube content. NVIDIA could use Flash for this application, and indeed showed a Flash-based demo of 3D Vision Live at NAB streaming 3D content from YouTube. When it came time to launch 3D Vision Live, however, Adobe was not as aggressive as Microsoft in rolling out 3D support. Adobe is planning to support S-3D in a future release of Flash, but Microsoft was willing to provide resources to work with NVIDIA to get S-3D streaming to work in this beta environment.
NVIDIA has plans to expand 3D Vision Live with updates planned for later in October, more updates in November and yet another round of updates just before CES in January. They are planning to improve content and features, such as DRM support, off-line viewing, support for Flash and HTML5, support for YouTube 3D videos, more content from movie and game studios (possibly segregated into channels) and some way to directly monetize this investment, perhaps creating a 3D video marketplace.
3D Vision Live, while still in beta, provides a surprisingly streamlined user experience to enjoy S-3D content - it figures out what you have and plays the content appropriately. For those of us that have used S-3D on our PCs for the past couple of years, this is the first time that it actually works as advertised. Let’s hope that they can get more content so that we can move from the ‘gee-whiz’ phase to actually watching something entertaining.
By Dale Maunu, DisplayDaily
Despite the wealth of attention given to the capture and production side of the 3DTV equation, distribution and delivery have been largely swept under the rug as 3D technology continues to wade into uncharted waters. At the Content & Communications World 2010 this week, leaders from the cable, satellite, and transmission fields gathered to discuss the contribution and distribution challenges that broadcasters and operators face in delivering content from the “venue to the viewer.”
“Most of the time, 3D distribution to our customers is actually very simple,” said Hanno Basse, SVP of broadcast systems engineering for DirecTV. “We only had to make a few revisions in our set-top box software so all of our existing MPEG-4 HD boxes could receive 3D and output 3D. We just had to work around a few graphics problems, but, other than that, we’re there.”
Three Dimensions, Half the Resolution
In order to distribute 3D within their current infrastructure, satellite and cable providers have used the frame-compatible 3D format, which carries separate left and right video signals within the video frame used to convey a conventional 2D HD signal, squeezing them to fit within the space of one image. While this allows operators to utilize existing set-top boxes (with a software update), it also essentially cuts the video resolution in half.
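The squeeze the paragraph above describes is what costs half the resolution. Here is a minimal sketch of side-by-side packing, assuming simple column decimation (real encoders use proper filtering before subsampling); the function name and frame representation are illustrative, not from any operator's actual code.

```python
def pack_side_by_side(left, right):
    """Squeeze two full-resolution views into one frame-compatible frame.

    Each view is horizontally decimated (every other column dropped),
    so half the horizontal detail is discarded and cannot be recovered
    downstream -- the loss Holden describes in the quote below.
    """
    def squeeze(view):
        return [row[::2] for row in view]  # keep every other column

    l, r = squeeze(left), squeeze(right)
    # The packed frame has the same dimensions as one original view,
    # so it travels through the existing 2D HD plant unchanged.
    return [lrow + rrow for lrow, rrow in zip(l, r)]
```

A 1920-pixel-wide view becomes 960 pixels per eye inside the packed frame, which is why the result is often called "half-resolution" 3D.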
“The reason we adopted frame-compatible was that we could put it on our plant without making major changes to any of the infrastructure,” said Dan Holden, fellow and chief scientist, Comcast Media Center. “The problem is, once you take 3D video and you cut the resolution in half, there is no way to get that resolution back to that full quality that came from the venue. As we look to the future, we want to get to full-resolution 3D, just like everybody else in the industry, but that is a ways off.”
One Step Forward, One Step Back
Although frame-compatible delivery has allowed 3D programming to enter the home, it may have also created a brand-new obstacle on the road to full-resolution 3DTV.
“By trying to get 3D on the air immediately, we have actually created a new legacy format, this frame-compatible format,” said Matthew Goldman, VP of technology, Ericsson. “For a number of years, even when full-resolution 3D comes out, there will be lots of systems out there that can still respond only to frame-compatible 3D. We didn’t have to change the existing infrastructure with frame-compatible, but, with full-resolution, we will have to.”
A Multitude of Standards
The chief issue in the delivery of full-resolution 3D remains the lack of true standards for contribution and transmission. Broadcasters, operators, and consumer-electronics manufacturers must make a variety of format decisions when it comes to 3D: side-by-side or top-and-bottom, 1080i or 720p, and a host of others. This has made it difficult for the 3D ecosystem to come to an agreement on standards, which are necessary to spark the rollout of next-generation set-top boxes capable of full-res 3D.
“The question of standards and different formats is confusing the hell out of everybody: TV manufacturers, the HDMI consortium, and even us [satellite and cable operators],” said Basse. “This is not helping anybody. It’s definitely not helping anybody sell TVs, and, let’s face it, that’s really what 3D is all about right?”
Unfortunately for Basse and his fellow operators, a set of 3D standards looks to be a long way off. Although an MPEG subcommittee is currently in the process of setting such standards, results are most likely years, not months, away.
“I would guess that it’s probably going to take three years before we’ll see a standard come out of that body,” said Holden. “Then, it will be another year or two by the time it’s deployed to the consumer. So, if we say it’s five years before MPEG can give us a full-res–3D standard or option, there are definitely a few things we’ll need to look at in the interim.”
Is Full-Res 3D Worth It?
Perhaps the biggest question is whether full-resolution 3D is even worth all this fuss. Many argue that the actual viewer benefit of full-res is minuscule or even imperceptible.
“I don’t think there is a huge difference between frame-compatible resolution and the full resolution,” said Basse. “I’m not sure that there is any real tangible benefit for the consumer. Whether people are actually going to see the difference between the frame-compatible and the full-resolution formats is debatable.”
Holden concurred: “I’m not totally sure that full-resolution offers a huge advantage over half-resolution frame-compatible. We’re evaluating that in our lab. I don’t know if that means a 10%, 20%, 50% increase in video quality. But, until we actually set some of these proposals or proprietary solutions up for what we’re doing today, I’m not sure there is a real advantage to go there.”
The Holy Grail: Turning a Profit
Although frame-compatible delivery has provided a temporary solution for the distribution side of 3D, financial viability for 3D broadcasts remains a long way off.
“Right now, there are more than a dozen proposals out there on how to [deliver 3D],” said Goldman. “But I know all of us can agree on one thing: we have to make this economically viable. That is the holy grail.”
By Jason Dachman, Sports Video Group
Sony's Paul Cameron looks at the technology that could soon allow us to watch 3D TV from the comfort of our living rooms:
Friday, October 15, 2010
SBS has launched a pilot 3D channel in the Netherlands which will offer up to 20 hours of 3D programming per day, including material converted from 2D. The 2D/3D conversion strategy is in contrast with the approaches of other pioneering 3D channels including Sky 3D in the UK and Discovery in the United States.
“Some content will be up-converted, and that could be up to 12 hours a day,” explained Josbert van Rooijen, CTO, SBS Broadcasting Netherlands. “We do simulcast of the 2D channel and the 3D channel. On the 3D channel, sometimes there will be a 3D version of the same programme. In some cases there will be separately produced 2D and 3D.”
SBS is already using tools such as the JVC IF-2D3D1 Stereoscopic Image Processor, which works as a 2D-to-3D converter. “We are looking at other systems as well,” reported van Rooijen. “Conversion is a necessary element of the experiments. It is something you need to learn. If I look at, for example, 3D sports, I think it will always be assisting in producing 3D.”
For 3D production, SBS has acquired Panasonic’s new AG-3DA1 twin lens Full HD 3D camcorder, which will be used for instance to shoot original 3D programmes such as a talk show about astrology.
For 3D on-air graphics, SBS has installed the first VidiGo Live 3D which was launched at IBC. “We are already using their system in 2D production,” van Rooijen reported. “3D is something they embraced right away and they used their technology to get it on the road really quick.
“We have produced just one 3D on-air promo to go through the learning process,” he said. “The same with IDs and commercials.”
By Carolyn Giardina, TVB Europe
Japanese projector developers are set to showcase new technology that makes 3D images viewable without the need for cumbersome glasses. Researchers at Keio University are preparing to unveil the projector to the public at the Digital Content Expo 2010 being held in Tokyo later this week.
Lead developer, Professor Susumu Tachi, said the technology uses special reflecting equipment to project an image from 42 different viewpoints simultaneously, creating a floating 3D image in the middle of a case about 50 centimetres square in size.
Furthermore, by using a camera to track the position of the viewer's finger, which is fitted with a belt that expands and contracts as appropriate, the system gives viewers the feeling that they are touching and moving the objects. In an advance press demonstration on Monday, an image of a girl cartoon character was projected and moved around by the demonstrator.
Professor Tachi said, "In the future, we will be able to use the technology for such applications as virtual exhibits at museums and shopping catalogues."
RePro3D is a full-parallax 3D display system suitable for interactive 3D applications. The approach is based on a retro-reflective projection technology in which several images from a projector array are displayed on a retro-reflective screen. When viewers look at the screen through a half mirror, they see a 3D image superimposed on the real scene without glasses.
RePro3D has a sensor function to recognize user input, so it can support some interactive features, such as manipulation of 3D objects. To display smooth motion parallax, the system uses a high-density array of projection lenses in a matrix on a high-luminance LCD.
The array is integrated with an LCD, a half mirror, and a retro-reflector as a screen. An infrared camera senses user input.
By Nick Thompson, Projector Point
In a 3D live streaming event on October 10, 2010 on 3DF33D.tv (see recording here), General 3D announced the world’s first web-based system to stream 3D stereoscopic videos using only a browser. This new system uses the HTML5 and WebGL standards being built into Mozilla Firefox, Google Chrome and Apple Safari. This new web technology makes sophisticated computer graphics possible without browser plugins or downloads.
The General 3D technology based on HTML5 and WebGL opens up an entirely new category of technology for viewing 3D stereoscopic content including video, still photography, and computer graphics with a wide range of applications in communications, broadcasting and gaming.
General 3D’s 3DF33D.tv service will allow members to easily stream 3D stereoscopic content such as videos and still images. This year, General 3D will also offer the capability for 3DF33D.tv members to broadcast live 3D stereoscopic streaming video using a variety of stereoscopic camera systems. As the 3DF33D.tv live streaming service comes on line, 3D webcams will be offered to members at a special introductory price through the 3DF33D.tv store.
After an easy 3D display setup on 3DF33D.tv, members will be able to upload 3D stereoscopic images and videos and participate in discussions related to 3D. In addition to member-uploaded content, 3DF33D.tv will be ready in the near future to host short-form and long-form video content such as movies, television shows and webisodes.
General 3D is creating a system to support 3D stereoscopic advertisements on the web. Advertisers will be able to engage viewers with many new forms of ads, all in stereoscopic 3D, enabling a new dimension of connection with viewers. General 3D is bringing a new level of excitement and engagement to advertising. General 3D’s expert 3D advertising producers can create 3D commercials for you and can also train your content creation staff in the art of 3D commercials.
The General 3D technology will use open video standards, including WebM (VP8) and Ogg Theora, which are both unencumbered by known patents and can be used freely without incurring licensing costs.
General 3D technology will support a wide range of 3DTVs and 3D displays. Multiview autostereoscopic (glasses free) display support will be coming out in December of this year. The 3DF33D.tv player has been tested on Samsung, Sony, and Panasonic 3DTVs using shutter glasses, and Acer notebooks using polarized glasses and will operate out-of-the-box on many other existing 3DTVs, 3D notebooks and 3D display systems.
To view stereoscopic 3D videos on the 3DF33D.tv system, you need a supported 3DTV or 3D display (with glasses), a computer and a free member account on 3DF33D.tv. In the future, General 3D will also offer a set top box with advanced functionality, eliminating the need for a computer.
3DF33D.tv will build its video library starting with stereoscopic video at 360p, 540p and 720p (per eye). The resolution of video playback will depend on available bandwidth at the point of playback. Most people with high speed internet (and the required 3D display and glasses) will be in good shape to have a very enjoyable 3D stereoscopic experience.
Source: General 3D
French independent HD channel myZen.tv has taken advantage of MIPCOM in Cannes to present new innovations in 3D. Dedicated to health and well-being, the channel was launched in HD three years ago by group Melody headed by Bruno Lecluse and is carried on cable and IPTV platforms in France as well as into 40 countries abroad.
Now, in partnership with Grand Angle and Convergence Images, the channel has started producing contents in 3D.
Beyond 3D images, myZen.tv has also established a partnership with the company a-volute to distribute 3D sound in a new technical standard that works with any kind of speaker, including PC speakers. The channel is even testing a system for diffusing odours related to images.
With these innovations, said the channel, myZen.tv wants to “propose an ever more astonishing and deeper relaxation to its viewers, using several senses: sight in 3D, hearing and smell.”
In executing its strategic plan, the group anticipates the launch of a 3D channel, myZen.tv 3D, which could be launched at the end of this year as reported by newsletter Satellifax. Prior to this, 3D VOD tests will be made on operators’ platforms.
By Pascale Paoli-Lebailly, RapidTV News
Sunday, October 10, 2010
Dolby Laboratories, MikroM, USL and XDC together announced an alliance to promote greater interoperability within the digital cinema market. The companies joining in this collaborative effort, referred to as the Digital Cinema Open System Alliance (DCOSA), are working to establish open interface standards for core system components.
Initial goals of the DCOSA are to develop common interface specifications between digital cinema servers and integrated media block (IMB) products and to promote these specifications for use across the cinema market. By introducing these open standards, the DCOSA hopes to drive down the cost of development, to increase flexibility, and to provide a platform for innovation.
"The cinema industry is interested in moving away from having a tightly coupled server and media block available only from a single vendor and moving toward an IMB built using open interface specifications," said Robin Selden, Senior Vice President, Marketing, Dolby Laboratories. "An open platform will increase competition within the industry and encourage cutting-edge development."
"With the rapid pace of advancement in hardware technology, the DCOSA hopes to decrease time to market by enabling flexibility in the choice of core system components. These systems have not been designed for interoperability in the past because of the lack of interface specifications," said Holger Krahn, Chief Executive Officer, MikroM. "Effectively, this initiative will provide exhibitors with more options in the selection of system components."
"As the digital cinema industry continues to evolve, there is an increasing need for well-defined interfaces," said Larry Hildenbrand, Director, Engineering, USL, Inc. "The DCOSA will help to deliver open standards allowing for greater flexibility in system design and enabling exhibitors to create custom installations."
"As more and more playback technology moves into the projector, it is critical that the industry adopt common interface standards," said Jérôme Delvaux, Vice President, Technology, XDC. "We are excited to be a part of the DCOSA, since establishing interoperability will be instrumental in the ultimate success of this market."
The DCOSA plans to make its server and IMB interface specifications available as an open standard for use by the digital cinema industry. The specifications are expected to be completed and available in the first half of 2011. Information regarding specific product availability can be obtained directly from the member companies.
Technical Overview Document
This document provides an overview of the server / IMB interface, including a system block diagram and typical use cases.
Friday, October 08, 2010
Few people question the idea that eventually all video delivery will migrate to IP, but it now looks increasingly likely that all video will migrate to Internet-delivered IP. The drive to deliver multi-screen TV means the television industry is increasingly working with IP streaming and progressive download technologies, while new CDN architectures look set to improve the Quality of Service (QoS) for web-delivered content. Now there is a view that the emergence of HTTP for streaming to PCs, tablets and mobiles could provide the basis for a common video delivery approach that will also encompass the set-top box.
It is clear that to make multi-screen TV work as a business, and certainly as service providers scale beyond a few channels of online or mobile content, content management and video processing functions must be rationalised so TV, mobile and online do not operate in separate silos. But for now, content will still be output in the codecs, resolutions, bit rates and transport stream wrappers that best suit their target devices. Nevertheless, the use of HTTP for Adobe, Microsoft and Apple adaptive bit rate streaming solutions, and the warm welcome these technologies have received as a way to improve the Quality of Experience, provides an important common denominator for video in the mobile and fixed broadband environments.
Verimatrix, which provides content protection and revenue solutions for multi-screen TV, has already noted the significance of this change. In its White Paper, ‘Adaptive Rate Streaming: Pay-TV at an Inflection Point’, the company argues that the adaptive-HTTP approach can supplant legacy technologies, that they will alter the current framework of managed network versus Internet delivery for television, and that the two worlds are on the threshold of a rapid convergence. The company predicts that a major shift in the Pay TV business is therefore imminent.
The company points to the historical problems guaranteeing QoS using broadband video technologies combined with the need for Pay TV operators to make their content available everywhere to remain relevant. “The result is an odd situation: TV services, available in a natural IP format for in-home multi-device distribution and sharing, are essentially kept behind a firewall to preserve overall QoS. The traditional guidelines for an acceptable quality, enjoyable video experience can only be provided in the limited circumstances of fully controlled delivery pipes. Yet consumers demand that content flows freely whenever and wherever they want to consume it.”
The answer, according to Verimatrix, is adaptive rate streaming, which eliminates the concept of network managed QoS in favour of a client managed consumer experience. “The delivery technology makes use of what the Web does best – efficient and massively scalable delivery of data using the HTTP protocol.” Now consumers can enjoy the best experience possible depending on their hardware and bandwidth conditions.
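The "client managed consumer experience" works roughly as follows: the player measures how fast recent segments arrived and picks the highest bitrate rung the connection can sustain. Here is a minimal sketch of that selection logic, under stated assumptions: the bitrate ladder, function name and safety factor are illustrative, not from Verimatrix or any specific player implementation.

```python
# Hypothetical bitrate ladder (kbit/s) published in the stream manifest.
BITRATES_KBPS = [400, 1200, 2500, 5000]


def choose_bitrate(recent_throughputs_kbps, safety=0.8):
    """Pick the highest ladder rung the measured throughput can sustain.

    recent_throughputs_kbps -- measured download rates of the last few
                               segments; empty on startup
    safety                  -- headroom factor so the buffer does not
                               drain on a small throughput dip
    """
    if not recent_throughputs_kbps:
        return BITRATES_KBPS[0]  # start conservatively
    # Use the worst recent measurement, scaled down for headroom.
    estimate = min(recent_throughputs_kbps) * safety
    eligible = [b for b in BITRATES_KBPS if b <= estimate]
    return eligible[-1] if eligible else BITRATES_KBPS[0]
```

Because each segment is just an ordinary HTTP object, the next request can name a different rung, which is how quality follows bandwidth without any network-side QoS management.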
“Adaptive rate streaming of video provides an optimum quality viewing experience that scales effectively on global and local networks, makes highly effective use of today’s content distribution networks, and ensures that true HD media experiences over the Internet can become a reality,” the company says.
Verimatrix notes that HTTP-based streaming is particularly suited for the Internet since CDNs already have massive deployments of HTTP acceleration servers. That means they avoid significant CAPEX that would otherwise be required to support proprietary video streaming protocols like RTSP and RTMP, and can instead focus on optimizing and scaling their HTTP infrastructure for video distribution.
The company says: “In the short term, it’s likely that parallel and separate service backends will emerge that seek to support the OTT demand while leaving existing Pay TV systems infrastructure in place. But this seems like a short-term strategy only, especially with the rising stakes in a rapidly evolving market. A careful analysis shows that adaptive streaming addresses the main business challenges of multiple device delivery and can also successfully supplant the use of legacy technologies.
“The basic argument is that the legacy delivery technologies offer no substantial advantage over the new approach, when considered across the full range of required delivery services and subjective quality levels, and are probably a worse fit to the anticipated trajectory of market demand. In addition, it has become apparent that the drawbacks of legacy technologies are neatly sidestepped by the properties of these new protocols, leading to their eventual obsolescence.”
Verimatrix views adaptive HTTP streaming as a key tool for use even on managed delivery networks, leading the company to conclude that it should become the common approach to video delivery for the set-top box as well as the PC, tablets and mobile, etc.
Earlier this year we spoke to Giles Wilson, Head of Technology, Solution Area TV at Ericsson, who also highlighted the potential role of over-the-top video technologies for ‘traditional’ television. He noted that service providers will be able to use adaptive bit rate streaming across the home network to compensate for the unreliability of home wireless networks. He also suggested that if operators are using OTT-type protocols for video served to consumers off-net [unmanaged OTT away from home], they could use the same protocols for video served on-net [for example, at home in a cable operator footprint, connected to the HFC network]. In that second scenario the operator can substantially improve the QoS at home by having its own CDN that is optimised for traditional TV and OTT delivery.
With this approach, traditional TV becomes IP-based and uses the same protocols as over-the-top video and is delivered through a CDN as managed traffic with guaranteed QoS. For an IPTV provider, where video is already IP, the main difference would be that it is delivered with OTT protocols, eventually. This will help get the content onto different types of devices in the home as a managed service.
As we noted in the report, over the next few years the distinctions between the managed network in the home, the unmanaged video network in the home and the unmanaged network outside the home are going to become more important as service providers decide how to balance their multi-screen delivery needs. If Pay TV operators put a high emphasis on the ability to reach consumers away from home and believe in ‘cloud-based’ Internet platforms and video delivery, they may start to rationalise their video delivery infrastructure around Web technologies.
Another major driver in the trend towards managed home television services using adaptive bit rate streaming and HTTP is the likely penetration of connected TV devices. This will occur initially through the CE industry but eventually thanks to service providers offering their own connected set-top boxes and gateways as well. Connected TV is the most significant development in the multi-screen world and it is easy to see how adaptive bit rate streaming, and therefore HTTP, is going to infiltrate the television service provider market at different levels.
An example of the hybrid world we are entering comes from Netgem, which made two announcements at IBC 2010 that outline how rapidly adaptive streaming is taking root. First the company said its Netgem TV middleware will now support adaptive technologies to extend the reach of Pay TV operators, starting with support for Microsoft IIS Smooth Streaming (and PlayReady). Then it unveiled the Netgem N5000 Internet/TV adapter, which is aimed primarily at service providers as a way to help them counter the connected service offers of CE vendors and Google.
The N5000 hybrid device supports RTSP, progressive download, Apple HTTP Streaming and Microsoft Smooth Streaming. The lines between managed and unmanaged video, and between the classic private television network and the Internet, are clearly going to start blurring inside the service provider home and inside the networks.
Clearly service providers can rationalise the delivery of three-screen services with a common headend, unified content management and an umbrella content protection solution that manages all CA and DRM requirements, and still output video separately for television and for PC and mobile. This model looks perfectly viable but the likely success of adaptive bit rate streaming could open the way to even closer convergence of the video delivery environment. For years it has been obvious that the TV market is marching towards all-IP. Now it looks like that will mean Web-IP. Maybe that also means HTTP.
The rapid evolution of CDNs supports the view that Internet TV delivery technologies could start to dominate for all television.
By John Moulding, Videonet
Thursday, October 07, 2010