BBC R&D's website has a selection of project pages describing current projects, many of which come under the 'Production' heading. I thought I'd give a bit more background on how some of these projects fit together, and how the work is organised. I lead the 'Production Magic' section of R&D - a team of 17 people that looks after new developments in programme production technology, focusing on audio and video signal processing. All the projects currently listed under 'Production' come from my section.
I (naturally!) think that the work of Production Magic is amongst the most exciting in BBC R&D, because it usually leads directly to new things that are visible or audible to our audience. Some other work at R&D tends to be less visible - indeed, in projects such as video compression, the more invisible it is the better! The Production Magic section currently covers four broad areas of work: audio, the fundamentals of HDTV production technology, and two areas looking at particular applications of image processing and computer vision to TV production. I've summarised each area below:
Our audio work, led by Andrew Mason, covers both the development of technology to maintain and enhance the quality of the BBC's current audio output (TV, radio and online), and more forward-looking work into areas such as the recording, transmission and replay of 'periphonic' sound (sound that comes from all directions, including above and below). The 'Ambisonics & Periphony' project page gives some details on this project, including a link to a video. Other current work includes active involvement in international work on loudness measurement. Many people are surprised to learn that most audio level meters, such as the 'VU' meter, do not give readings that correlate particularly well with how people perceive loudness; a new metering standard using sophisticated digital filtering methods can produce a much more reliable indication. When used in conjunction with the relevant guidelines, it should avoid the problem of one programme on a channel sounding louder than another. We have also produced an explanation of this work.
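As a toy illustration of why this matters, the sketch below compares a plain mean-square level reading with one taken after a simple first-order high-pass filter. The filter is only a crude stand-in for the much more sophisticated frequency weighting used in the real metering standards, and all the numbers are illustrative, but it shows how a low-frequency tone can read the same as a mid-frequency one on a naive meter while registering much lower on a weighted one:

```python
import math

def rms_level_db(samples):
    """Plain mean-square level in dB - roughly what a simple meter reports."""
    mean_sq = sum(s * s for s in samples) / len(samples)
    return 10 * math.log10(mean_sq) if mean_sq > 0 else float("-inf")

def weighted_loudness_db(samples, alpha=0.9):
    """Toy 'perceptual' meter: a first-order high-pass filter stands in for
    the frequency weighting of real loudness standards, so low-frequency
    energy counts for less than it does in a plain RMS reading."""
    prev_in = prev_out = 0.0
    filtered = []
    for s in samples:
        out = alpha * (prev_out + s - prev_in)  # one-pole high-pass
        filtered.append(out)
        prev_in, prev_out = s, out
    return rms_level_db(filtered)
```

With this filter, a 50 Hz tone and a 1 kHz tone of identical amplitude read the same on the plain meter, but the 50 Hz tone reads several dB lower on the weighted one - closer to how a listener would judge them.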
Richard Salmon leads our work on HDTV Production Technology. One of the main things occupying Richard at the moment is the effect that the move from CRT to LCD displays is having on TV production: it is vital that the production team makes its assessments of image quality (including colour and brightness settings) on a monitor with a well-defined specification, otherwise the 'look' of programmes produced in one area won't match that from another. The (now rather old-fashioned) CRT monitor did a good job, but the new generation of LCD monitors has different colour rendition, along with a range of other problems such as changing appearance depending on the angle from which they are viewed. Richard is working with other broadcasters to address these issues.
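To see why a well-defined display specification matters, here is a minimal sketch using idealised power-law displays. The gamma values are illustrative assumptions, not measurements of real monitors, but they show how the same mid-grey signal produces visibly different light output on two displays with different transfer characteristics:

```python
def displayed_luminance(code_value, gamma):
    """Relative light output for a normalised code value (0..1) on an
    idealised display with a pure power-law response."""
    return code_value ** gamma

# The same mid-grey signal on two hypothetical displays:
signal = 0.5
crt_like = displayed_luminance(signal, 2.4)  # assumed CRT-like gamma
lcd_like = displayed_luminance(signal, 2.0)  # assumed native LCD response
```

On these made-up figures the 'LCD' renders the same code value noticeably brighter than the 'CRT', so a grade judged on one would look wrong on the other - which is exactly why monitors used for quality assessment need a common, well-defined response.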
Oliver Grau leads our work on 3D content production. Rather than focusing on stereoscopic 3D (systems using two images, one for each eye, as used in 3D cinemas), Oliver's work is looking at capturing complete 3D models of the action that could be viewed from any angle. One application we've looked at is sports coverage: images from multiple cameras covering a football match can be processed to build a 3D model for every frame, allowing a 'virtual camera' to view the action from any location, such as the referee's or goalkeeper's viewpoint. We are developing this technology further in a collaborative project looking at aspects such as increasing the detail in important areas of the scene by using robotically-controlled cameras to capture close-ups of faces. Another collaborative project is looking at new ways of capturing stereoscopic 3D, and we are investigating rendering two closely-spaced views from the 3D models we capture to produce a stereoscopic image pair. This could cut the cost of stereoscopic 3D production for programmes such as sports.
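As a rough sketch of the 'two closely-spaced views' idea, the snippet below projects a point from a 3D model through two pinhole cameras separated by a small interaxial distance; the horizontal offset between the two resulting image positions is the disparity that gives the stereoscopic depth effect. The focal length, image centre and spacing are made-up illustrative values, not a real camera calibration:

```python
def project(point, cam_x, focal=1000.0, cx=960.0, cy=540.0):
    """Pinhole projection of a 3D point (x, y, z) in metres into an image,
    for a camera at (cam_x, 0, 0) looking down +z. Illustrative numbers."""
    x, y, z = point
    u = focal * (x - cam_x) / z + cx
    v = focal * y / z + cy
    return u, v

# Two virtual cameras a small interaxial distance apart view the same point.
interaxial = 0.065            # roughly human eye spacing, metres (assumed)
point = (0.0, 0.0, 5.0)       # a point 5 m in front of the rig
left = project(point, -interaxial / 2)
right = project(point, +interaxial / 2)
disparity = left[0] - right[0]
```

Nearer points produce larger disparities, which is what makes them appear closer to the viewer; rendering every point of the captured model this way yields a left/right pair without needing a physical stereo camera rig.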
In addition to leading the section, I also look after an area of work concerned with real-time image-based tracking and the incorporation of 3D graphics into images in real time. I've been working in this area for around 15 years, and in that time have worked with my team to develop several new production tools that have been turned into products by licensees. Some of these past projects have web pages tucked away on the site, and include a system for tracking the movement of cameras in a TV studio, using circular barcoded markers on the ceiling, to allow virtual objects to be inserted into the image such that they appear to be fixed in space. It has been used on a number of productions, including (perhaps surprisingly) some outside the BBC, the product having been sold to them by our licensee. Other past work includes a system for tracking camera movement from the lines on a sports pitch, which has been licensed commercially and is used regularly on programmes like Match of the Day for effects such as virtual offside lines. One of our current projects is extending this work to allow the tracking of camera movement from features other than lines, for applications such as drawing distance markings on a long-jump sand pit.
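The principle behind keeping a virtual object 'nailed' to a point in space can be sketched with a simple pinhole model: if the tracker reports the camera's pan angle accurately, the graphic can be redrawn at exactly the image position a real object at that world position would occupy, so it appears fixed as the camera moves. The focal length and image centre below are illustrative assumptions, and a real system must of course recover full position, rotation and lens state, not just pan:

```python
import math

def project_with_pan(world_point, pan_deg, focal=1000.0, cx=960.0, cy=540.0):
    """Image position of a fixed world point as seen by a camera at the
    origin that has panned by pan_deg about the vertical axis."""
    x, y, z = world_point
    pan = math.radians(pan_deg)
    # rotate the world point into camera coordinates
    xc = math.cos(pan) * x - math.sin(pan) * z
    zc = math.sin(pan) * x + math.cos(pan) * z
    return focal * xc / zc + cx, focal * y / zc + cy
```

With the camera pointing straight at a point 10 m ahead, the graphic sits at the image centre; as the camera pans, the drawing position slides across the frame by just the amount needed to keep the virtual object apparently stationary in the studio.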
With the 2012 Olympics in mind, we are looking at ways of providing more in-depth analysis of athletics by tracking the motion of athletes. Another related project is looking at ways of 'bringing to life' a 3D model of an area such as the 2012 Olympic Park, by registering live camera images against a 3D model of the area and 'embedding' them at the correct location. This could allow a user to 'fly' around the area and see what's happening live, through a Flash-based or other application in a web browser. We've recently started another project that will give users a different way of viewing live events, rather than using a 3D representation - I'll write a separate blog post about that later.