
IRFS Weeknotes #108


Chris Godbert | 17:00 UK time, Monday, 21 May 2012

We've been doing some interesting UX research for ABC-IP (the Automatic Broadcast Interlinking Project) into alternative approaches to publishing large programme archives. Making archives available online is a costly business that typically involves a high degree of curation by skilled editorial staff. As part of ABC-IP we're looking at whether we could publish large archives with less editorial effort by using computer processing and crowdsourcing techniques. We've been using the large World Service radio archive as a test case.

Yves has worked wonders with his speech-to-text and automatic concept extraction software, automatically generating metadata that forms a tag cloud for each programme; these tag clouds are the backbone of content discovery in the prototype we've developed. However, because the tags are generated by machines rather than by people, improving their accuracy is a significant challenge.
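
To make the general idea concrete, here is a minimal sketch (not Yves's actual pipeline) of how concepts extracted from a programme's transcript might be turned into a weighted tag cloud: the concept-extraction step is assumed to have already produced a list of terms, which are then counted and normalised so the most prominent terms dominate the cloud.

    # Hypothetical sketch only: counts extracted concept terms and
    # normalises the counts to weights in [0, 1] for a tag cloud.
    from collections import Counter

    def build_tag_cloud(concept_terms, max_tags=20):
        """Count extracted concepts and normalise counts to tag weights."""
        counts = Counter(concept_terms)
        if not counts:
            return {}
        top = counts.most_common(max_tags)
        peak = top[0][1]
        return {term: count / peak for term, count in top}

    # e.g. concepts extracted from one programme's transcript
    concepts = ["nigeria", "oil", "economy", "oil", "lagos", "oil", "economy"]
    print(build_tag_cloud(concepts))
    # -> oil 1.0, economy ~0.67, nigeria ~0.33, lagos ~0.33

The weights here come purely from how often a concept is detected, which is exactly why human correction of the tags matters so much.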

Over the next few weeks we'll be running an online experiment with participants from our partners at Global Minds to explore different techniques for correcting automatically generated tags, and what we learn will feed into the development of future archive projects.

A screenshot of the tagging experiment

In the experiment the Global Minds participants will be asked to listen to a short audio clip from a World Service programme. They will then be asked to identify the tags they find relevant or erroneous; each judgement adjusts the weighting of the tag and tells us how useful it is in describing the content. The experiment is designed to accelerate the effect of successive users on a tag, so that inaccuracies are corrected in the shortest possible time and the weighting then remains stable for subsequent users.
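
For illustration, here is a minimal sketch of one possible vote-based weighting scheme; this is our assumption for the sake of example, not the experiment's actual algorithm. Each relevant/erroneous judgement nudges a tag's weight, and earlier votes carry more influence, so inaccuracies are corrected quickly and the weight then settles for later users.

    # Hypothetical scheme: step size shrinks as votes accumulate, so early
    # corrections move the weight a lot and later ones stabilise it.
    def update_weight(weight, votes_so_far, is_relevant, base_step=0.5):
        """Apply one user's relevant/erroneous judgement to a tag weight."""
        step = base_step / (1 + votes_so_far)
        delta = step if is_relevant else -step
        return min(1.0, max(0.0, weight + delta))

    # An over-weighted, inaccurate tag being corrected by successive users
    weight = 0.9
    for n, judged_relevant in enumerate([False, False, False, True, False]):
        weight = update_weight(weight, n, judged_relevant)
        print(f"after vote {n + 1}: {weight:.2f}")

The experiment will compare several correction techniques; the point of the sketch is only the shape of the correction curve, with big early moves and smaller later ones.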

In other project news, a big milestone was reached this week on FI-Content: the project's formal year 1 review in Brussels, comprising presentations of progress across all the work packages since the last review. I'm pleased to say we passed with flying colours, and the reviewers were very supportive and encouraging about the range of use cases we are developing. We've also just completed the latest internal project iteration, which delivered a Chrome extension that lets us test attitudes to user data in more detail. We are currently testing it and getting ready for the user trial, which is due to start shortly. In the next phase of the project we will be looking at device authentication.

Vicky has been working with Salford University on recruitment for the Internet of 'My' Things BBC/FIRM research fellow. The BBC is looking ahead to a world where human-friendly, network-aware technology is the norm. In this world people will continue to tell stories. The project is about prototyping a reusable toolset for exploring and creating experiences in that world, and potentially defining new standards for technology and usability. Ultimately, the BBC wants to understand how informing, educating and entertaining audiences could change. This one-year fixed-term post is based at MediaCityUK, Salford. The deadline for applications is Friday, 8th June. For full details, and to apply online, please visit the .

On Thursday, Libby brought in her , which prompted a small hacking session, and after some fiddling with proxies we managed to get live TV streaming. We used , a very barebones Linux-based system that boots directly into XBMC. On Friday we got the chance to show a selection of our recent projects to the heads of R&D, including the egBox, our TV-in-a-browser prototype. Yves has also been reviewing for the , and preparing for the conference in two weeks, where he will present a paper on ontology evaluation, contribute to a panel, and chair the In-Use track.

Chris L attended the and saw some people doing interesting things with music and the Web Audio API. Vicky went to a conference about the , hosted by The Creative Exchange. There were some interesting speakers, including Richard Harper from Microsoft Research, Neville Brody at the RCA, and our own Bill Thompson from BBC Archive.

Finally, we had some really interesting visitors this week. The Snippets team are working with the and invited them in to show their scene and object recognition and visual search. Amazing. And at our team meeting, , from Goldsmiths College talked about Daphne Oram and his . He showed , a machine-learned visualisation of her unclassified music archive, and , a generative music app. Brilliant.

Some interesting things we've found this week:

  • Theo's been looking for design inspiration.
  • An interesting link about who are trying to standardise APIs for accessing device hardware.
  • Useful link about preparing fonts and graphics for the web so that they're .
  • you can download and use in whatever way you see fit.
