
Universal Control


Steve Jolly | 11:57 UK time, Monday, 28 February 2011

A few weeks ago we published a White Paper containing a version of the Universal Control API, and announced it on this blog. We think the technology is important enough, and will have a wide enough impact, that it's worth discussing it in detail in a way that will hopefully make our aims in this work clear to everyone. I'll also explain how the API relates to recent posts on this blog about "Orchestrated Media".

Television is going through a pretty exciting time right now. TVs and set-top boxes (STBs) are starting to gain connections to the home network and to the Internet. More and more phones are "smart phones", with good web browsing experiences and the ability to run user-installed applications. Increasingly, they also have WiFi connections that give them access to the home network as well as to the Internet. Tablet devices like the Apple iPad are selling so well that a significant number of consumer electronics manufacturers are now rushing to get competing products onto the market. We think that these advances open up some really interesting possibilities, and I'm going to talk about some of them here.

The first opportunity we see is in remote control. Obviously we're not the first people to consider this: several companies already offer applications for mobile devices that act as remote controls for other home media devices or software applications. We think that the end user would benefit from a slightly different approach, though: one with all of the following advantages:

  • The device being controlled would not get to dictate the nature of the remote user interface (UI): this would be entirely under the control of the software running on the remote device. I want to be clear here: simulating an infra-red remote control on a smartphone is a useful stopgap, but you can do so much better than that. Why not shift the entire UI to the smartphone, and keep the television pictures free of clutter?
  • The approach would be completely agnostic with regard to the source of the media being presented: it could come from a broadcast television service, an Internet video-on-demand (VOD) service, another device on the home network, stored media on the device itself or some other source entirely.
  • In addition to presenting "linear" media such as audio and video, some media devices are becoming increasingly sophisticated in the "interactive" media they support (from existing "red button" apps on digital television or DVD menus to the web widgets, Flash and native apps offered by the new generation of TVs and other gadgets). We need to be able to select and interact with both kinds of media, but interactive media offers up a particularly exciting possibility that I'll talk about some more below.
  • The control of functionality common to many home media devices would be standardised, but the OEMs of those devices would be free to extend that standard to support their devices' unique features.
  • The approach would not be restricted to the control of just one class of device; it would apply equally to televisions, set-top boxes, radios, Blu-ray players - indeed any kind of device that selects and presents media to the user.
  • A single remote control user interface could control many different media devices within the home.

Given the trend for home media devices to gain connections to the home network, we believe that the best way to gain the advantages listed above is via a standardised API: a way for devices to communicate with one another over the home network to share information about and control the presentation of media to the user.
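
To make the idea concrete, here is a minimal sketch (in Python, using only the standard library) of what a client of such an API might look like: a remote UI running on a phone or tablet sends a simple command to a media device over the home network. The device address, resource path and JSON fields here are illustrative assumptions, not the published Universal Control specification.

    import json
    import urllib.request

    STB = "http://192.168.1.50:8080"  # hypothetical address of a set-top box on the home network

    def send_command(path, body):
        """POST a JSON command to the device and return its JSON reply."""
        data = json.dumps(body).encode("utf-8")
        req = urllib.request.Request(STB + path, data=data,
                                     headers={"Content-Type": "application/json"})
        with urllib.request.urlopen(req) as resp:
            return json.load(resp)

    # Pause whatever the box is currently presenting, regardless of whether it
    # came from broadcast, VOD, local storage or another device on the network.
    send_command("/control/output", {"playback": "paused"})

Because the client is just making ordinary HTTP requests, the remote UI itself can be designed entirely on the controlling device, independent of what the television chooses to draw on screen.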

We also think that while everyone would benefit from the existence of a remote UI API of this kind, there is a group of people who would benefit from it enormously: people with accessibility requirements, such as blind users or people with motor dysfunction.

Improving Accessibility

Conventional STB user interfaces (the Electronic Programme Guides (EPGs), menu systems and other ways that users interact with the devices) are built around an infra-red remote control with a multitude of buttons, and visual representations of information that are shown on the television screen. The blind and partially sighted are the people most obviously at a disadvantage when presented with an interface of this kind, but they are by no means the only group of people with accessibility requirements who would benefit from an interface better suited to their needs: for example, people with various degrees of motor dysfunction or cognitive impairment (or indeed people with multiple impairments) are equally worthy of consideration.

One solution to the lack of accessibility in the user interfaces built into STBs is, obviously, to improve them. With digital switchover in progress, work with partners on this has been under way since 2008, and devices that render EPGs and menu items as speech are now on the market. Devices like this clearly offer a huge benefit to the people they target, but it would be better still if all home media devices could be made accessible, to the greatest degree possible, to people with any accessibility requirement. An API such as the one described above is an ideal means to achieve that.
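
As a sketch of how an API like this could support an accessible interface, the fragment below fetches now-and-next listings from the box and reads them aloud instead of drawing them on screen. The resource path, field names and the stand-in speech function are assumptions made purely for illustration.

    import json
    import urllib.request

    STB = "http://192.168.1.50:8080"  # hypothetical set-top box address

    def fetch(path):
        """GET a resource from the box and parse it as JSON."""
        with urllib.request.urlopen(STB + path) as resp:
            return json.load(resp)

    def speak(text):
        # Stand-in for a real text-to-speech engine on the client device.
        print("[spoken]", text)

    # Read the programme guide aloud rather than rendering it on the television.
    for prog in fetch("/guide/now-next")["programmes"]:
        speak("{0}: {1}, starting at {2}".format(prog["channel"], prog["title"], prog["start"]))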

Orchestrated Media

My colleague Jerry Kramskoy has already written a couple of posts about what we are calling "Orchestrated Media": a new kind of experience in which multiple devices collaborate. This is another reason for improving communication between home media devices:

Firstly, if the other devices on your home network could find out exactly what you were watching on your television and exactly where you were within the programme, then content on those devices could be synchronised to the television. Again, we know there's a demand for this because people are already doing it: several companies have products on the market that use a short recording of a programme's soundtrack, captured by a device's microphone, to identify the programme being watched. Clearly these approaches work, but we believe that simply asking the STB or television, via the home network, what it is currently showing would be more reliable, and significantly quicker and simpler.
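
For example, a companion device could simply poll the box for the identity of the current programme and the playback position, rather than sampling audio. The sketch below assumes a hypothetical resource name and JSON fields; the real API's resources may differ.

    import json
    import time
    import urllib.request

    STB = "http://192.168.1.50:8080"  # hypothetical set-top box address

    def playback_state():
        """Ask the box what it is showing and how far through it is."""
        with urllib.request.urlopen(STB + "/output/playback") as resp:
            return json.load(resp)

    while True:
        state = playback_state()
        # e.g. {"programme_id": "b00xyz12", "position": 1432.5, "rate": 1.0}
        print("Sync companion content to", state["programme_id"],
              "at", state["position"], "seconds")
        time.sleep(2)  # coarse polling; a real client might subscribe to change notifications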

Secondly, we believe that the interactive experiences people have with their home media devices are also evolving. Existing "interactive TV" experiences like "red button" applications are gaining Internet connectivity, and a new generation of set-top boxes can present their users with interactive experiences in the form of network-aware applications. We think that introducing a standard way for applications running on home media devices to talk to applications running on users' mobile phones, tablets, laptops and so on would enable a new kind of really rich interactive experience, in which multiple devices collaborate to gather information from, and present things to, the user. For example, a broadcaster could run a quiz show in which everyone in the home can take part on their mobile device, without having to invest in the infrastructure necessary to provide such a service via the Internet.
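
To illustrate, a companion quiz application might deliver each household member's answer straight to the interactive application running on the set-top box, with no Internet round trip. The addressing scheme and message format below are assumptions for the sake of the example.

    import json
    import urllib.request

    STB = "http://192.168.1.50:8080"  # hypothetical set-top box address

    def send_to_tv_app(app_id, message):
        """Deliver a JSON message to an application running on the box."""
        data = json.dumps(message).encode("utf-8")
        req = urllib.request.Request(STB + "/apps/" + app_id + "/messages", data=data,
                                     headers={"Content-Type": "application/json"})
        urllib.request.urlopen(req).close()

    # One household member answers question 3 of the quiz from their phone.
    send_to_tv_app("quiz-show", {"player": "alice", "question": 3, "answer": "B"})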

An Announcement

If you follow this blog, you'll have seen the recent announcement of the publication of a white paper on "Universal Control". That's the name we've given to the API we've designed that has all of the above advantages, and which we are now in the process of publishing as a W3C member submission, as well as discussing directly with interested parties. The white paper we have already published contains the specification for the API, and a companion document will be published shortly giving more information about what it's trying to achieve, and explaining some of the design decisions we took in creating it.

What we've designed is a web API served from the STB over the home network. We've designed it to be as easy as possible for STB manufacturers to implement, and to have as shallow a learning curve as possible for client developers. We've taken steps to ensure that clients in particular can be written for a huge range of devices, from embedded platforms and J2ME mobile phones all the way to smartphones, tablets and personal computers, whether as native applications, web pages, web widgets or Flash applications.

Our API is complementary to other home network technologies such as DLNA, which concentrates on interoperable streaming of media between devices. DLNA does not address the issues associated with control of, and communication with, interactive content. Conversely, our API does not address the streaming of media between devices, but it is entirely compatible with the use of DLNA for this purpose. In addition, the remote control functionality defined by DLNA does not deliver the kind of interoperable remote control applications that our API enables: the DLNA Remote User Interface standard, which also defines a way for home media devices to be controlled via the local network, does not allow the design of the remote interface to be customised to the capabilities of the client device or to the requirements of the user.

What we've published is implementable today: we have an example implementation and a simple web-based client that we plan to release as Open Source software in the near future. More importantly though, it's a starting point for wider discussions. Our conversations with colleagues in other parts of the BBC and with partners from many parts of the industry have convinced us that the benefits I've mentioned here are genuine, but in order for the licence-fee payer to see those benefits, we're going to have to collaborate with a much wider range of people: this publication is just the beginning.

Comments

  • Comment number 1.

    This comment has been referred for further consideration.

  • Comment number 2.

    I think that accessibility is very important when it comes to manipulation of the TV and its associated devices. However, I still feel that handheld remote controls are perhaps slightly more flawed than most people recognise. For example, someone with motor dysfunction can not be expected to use a handheld remote control to manipulate their TV equipment, and yet they haven't got much choice.

    The question is, how can TV equipment be manipulated without yet another device?
