BBC

Archives for April 2010

Prototyping Weeknotes #12 (30/04/10)


Chris Godbert | 12:51 UK time, Friday, 30 April 2010

Monday: Louise, the R&D librarian, comes in to explain their wonderful library system and how to order periodicals and abstracts, see our (small, very tech-focused) bookshelf, and introduce herself to the team. We take some colleagues from central FM&T through a prototype we built last summer - good to see the lessons are still useful. Tony explains his proposal for the project he'll work on during the six months he's attached to us - a cheap remote camera which could solve a real challenge being faced in production. There are a couple of major questions to answer before he can start scoping it, but it sounds interesting and is definitely a prototype.

Vicky, George and I are hard at work on the work plan for our section, to show R&D management this week. Chris N has made a couple of final changes to Coventry to get it deployed, so this should be ready to go live soon. Look out for a detailed blog post on the project. Chris B has done some work on fetching and displaying on-demand video for the multi-touch EPG work.

On Wednesday Ben, a final year student on the University of Northumbria's Industrial Design MA, came in to show us his project work on an intuitive TV remote control system. We grill him on some of the issues his work raises and see his physical prototypes. All good stuff. Mel from the UX team comes in to see our new multi-touch prototype in the afternoon, then it's back to the work plan.

We're making good progress looking at inbound links on the micro-blogs project. Sean's prototype is revealing some interesting results (especially watching Gordon Brown's mid-week comments zoom up the chart) which are being shaped into a simple information presentation. Chris N has started refactoring the ingest code to support the prototype.

Vicky and Theo have been out and about this week meeting lots of interesting people: Pierre in Vision who's working on a project about public artworks; Rob who's doing some work on Redux with the R&D Audience Experience Team; and a really exciting meeting with Nick Durrant and Gill Wildman to discuss their storytelling approach to service design. We hope to continue that conversation soon.

Sarah from the R&D press office is working out of our office on Thursday. Always good to see her, and we go through possible plans for a recent prototype, and discuss the concepts and possible press angle. We submit our slides for the big section review meeting tomorrow. Duncan has finished off the Dashboard spike and has integrated Anthony's work on train times; he's also started writing a document to understand what we would want from a Redux API. Next week he'll be starting the quiz sprint with Sam which is part of our June deliverables for the project.

It's Friday morning and we're doing a final run through of our presentation for our big meeting this afternoon. Chris B gives us a demo of what he built yesterday: a simple Flash and PHP-based audio ingest client-server chain for a wider R&D VoIP project (looking at audio contribution over the web) - amazing work for one day! No stand-ups this week (we've been distracted with lots of longer term thinking) but everything is on track; it makes me wonder whether we need them every day. Hopefully it will be more settled next week and we can put the focus back on our current projects.

A Touch Less Remote: Part 6 of 6


Vicky Spengler | 10:40 UK time, Friday, 30 April 2010

The BBC R&D Prototyping team has been investigating how multi-touch software could support television viewing in the future. In this final blog post Maxine Glancy and Connor Graham summarise what users made of the prototypes and draw some conclusions to assist in the design of future projects.

As covered in previous posts, we developed two prototypes we wanted to test. The first application supports the planning of an evening's viewing and the second builds on the idea that a group of people in a living room might like to interact with a television programme together rather than individually.

We included questions about the prototypes in home visits we organised for a wider study looking at usage of second screens in the home. This meant we could get a real sense of whether these new applications would meet users' expectations, to see if and how they might fit into the lives of the participants and their families, and to determine what the impact of their introduction might be.

We followed up the home visits with a focus group involving some of the household members we visited. At this group session we firstly asked participants to discuss their agreement and disagreement with particular statements about the demonstrations drawn from the home visit. We then asked them to use the demonstrations in a guided way and had a discussion around some key points.

With the first prototype the group identified some critical issues that would need to be addressed in a more fully developed application. They didn't fully understand the way 'sources' of recommendations had been presented, were unsure how they would refine a schedule once it was running, and were uncertain how one person's preferences related to and affected another's.

To fix these issues there would need to be clearer labelling of 'sources', functions to pause or override the recommended playlist, and a way for one user to take priority over another - as this is the reality in most homes! On a more positive note, many found the idea that their viewing habits and preferences could be captured appealing, which was encouraging.

With the second prototype there were, in our opinion, no critical issues; people generally understood what was happening. What is interesting, though, is that our participants were far more interested in the idea of multi-touch applications targeted at individual consumption, and from this evidence we believe these are more likely to gain acceptance in the home.

In the multi-user game format we tested, you can compare your household's collective scores with those of another home, such as a friend's, and with an overall national average. This stimulated discussion about the place of the household in the national context; our participants would be far more interested in playing against people they know than comparing their scores with the rest of the country.

The participants also felt it was essential that their interaction counted for something - so that, in the case of an Apprentice game, the votes would contribute to the outcome of the programme. Simply playing along with a programme is not enough.

While the participants enjoyed playing with the multi-touch device there was an overall resistance to it - partly at the idea of another piece of furniture in the home, but also because they assumed the multi-touch display would be incredibly expensive.

What we found most interesting from the study comes as two related thoughts. Firstly, the fact that large screens are normally situated in landscape format enforces a way of interacting with them, as people position themselves in particular ways; there is clearly scope for developing screens that allow interaction from any angle. Secondly, several participants seemed to want the applications to support individual as opposed to group consumption of media, possibly on a separate screen.

There is an opportunity here to start with the social richness of particular situations and design back from these. The multi-touch experiences should be built to draw on as opposed to enforce particular arrangements of people. It's also really important that these new technologies support and complement the use of personal devices that are becoming increasingly prevalent such as smartphones. This is an area we're very excited by and will be exploring further.

Prototyping Weeknotes #11 (23/04/10)


George Wright | 16:24 UK time, Friday, 23 April 2010

A big start to the week - I meet my boss and business manager to discuss our resourcing, output and plans, then go straight to a department-wide conference where Tris and I present our recent work on Mythology. We're just small fry - Mark Thompson, the BBC Director-General, is speaking, Erik Huggers (the Director of Future Media and Technology) is explaining his vision for the department, and it's all in front of a studio audience, filmed and broadcast on the BBC's ringmain internal TV network. It goes well, and we're pleased to be straight on after Tony C and his team presenting their Ambisonics work. I have a natter with some familiar faces from the audience and then hurry back to work.

Chris B from the BBC's News Interactive team comes across on Tuesday with some suggestions for new work, and Duncan's got the Dashboard up and deployed, with demo modules running against public APIs. He's been working with Tony C, our trainee, explaining some modern approaches to software engineering (as well as some sneaky tricks of his own).

Our former sysadmin, Matt P, comes to work with us for the afternoon and goes through some technical questions with Chris N - how we set up new VMs, creating a dedicated MySQL server, Nagios monitoring, etc. Sam's still working through our P2P-Next backlog: decentralised search, portal feeds and planning for LIMO prototypes. We have a quiz sprint on this planned in April, so Dom sent a mail to the P2P-Next project asking for input into the use cases for this.

On Wednesday, we got the new version of URIplay up and running - for the meantime we're still serving the Totem podcast feed from a static file (updated every 30 minutes), but we managed to squash some bugs from the Ubuntu bugtracker and submitted the fixes to people from Canonical. Chris, Vicky, Tris and I have been refining our project plan for the next 6 months to present to R&D management next week. It's getting there but still needs work. People from Vision come to see us with user stories and requirements for Buzz tracker to see how some of our recent research can help.

On Thursday we had an initial meeting to decide the scope of the "micro-blogging use case 3" project: identifying the most popular links to BBC pages, and Chris N spent the day looking at the ingest code to understand how it works. Sean and others then get all the components of the system for analysing links up and working.

On Friday, Theo has useful meetings with Childrens, Games Grid and Vision to see what they're up to (with a view to future collaboration).

I do lots more prep for next week and then get to experiment with a new commercial (but still experimental) multi-touch device that has arrived. I'm also prepping RadioDNS evaluation to explain to people next week, then it's hometime soon.

Prototyping Weeknotes #10 (16/04/10)


Chris Godbert | 15:32 UK time, Friday, 16 April 2010

We're starting a new project this week, Dashboard, which we're planning to iterate on for the next ten days. It's primarily Duncan and Tristan working on it, and with such tight timescales features are likely to be quite minimal initially. Theo and Vicky are starting to scope out the Outreach project - it's really exciting but seems a little daunting. Akua, Sam and Sean are back in the office so it feels a bit more like a normal week. The advert for our vacancy has gone out.

George is in Manchester interviewing people for R&D Industrial Trainee roles, meeting with R&D North people, and catching up with some of his old TV Platforms colleagues. He's impressed with the new R&D North Lab (and the beer prices up there).

Tuesday begins with a lengthy discussion about Microblogging use case 3, which is looking at inbound links to the ´óÏó´«Ã½. Sean is going to do a quick tech spike to clarify a few things before we integrate with our ingest chain. Vicky is going to start working up some possible visualisations. Chris N and Theo are in W12 learning more about triple stores and SPARQL. Vicky, Tristan, Theo and I have another go at the Themes board. We make some progress but hit a wall; we need to come back to it with fresh minds.

As part of the Dashboard project, Duncan has set up a simple Rails template for our projects that just creates an empty vanilla Rails app and supporting files in our preferred configuration. It's a first pass but should save us time during project start up. We find out that we'll be presenting the Mythology project at the all staff conference on Monday so Tristan has some extra work to do.

On Wednesday we continue working on our themes for next quarter; George adds a couple of new ones. It's looking like we'll have 4-5 main areas of work for the next 6 months or so which feels about right. Chris B has finished tidying up the loose ends on Coventry. Tristan's been checking it daily and whilst there are still a few anomalies in the data, it's definitely good enough for a prototype. The guys from the Music team come down for a demo and are really pleased so we'll release it soon.

Tim C from Audio & Music comes down to talk about events and what social features they plan to use; we conclude not to continue with our events use case for Microblogging. It's disappointing but we've gained some valuable learning - I guess that's why we try to use an agile approach: fail fast if you're going to fail. On the plus side, Sean is now successfully scraping and expanding BBC links from the update stream. Fritz, who worked on our multi-touch prototype, is also in this morning to hand over his work to Chris B and Duncan. More visitors in the afternoon from Vision to talk about multi-touch, second screens and synchronous quizzes. George is again interviewing for Industrial Trainees, this time in London.

So it's Friday; Tristan is writing up the Coventry project ahead of the release and blog post, George is meeting with Peter Brightwell, Phil Tudor and others from R&D to talk about the Ingex project and possible collaborations, Chris N is attending an internal QA day in W12, and the Dashboard has its first data on it - the BBC bus times! It has been hectic again this week, but that's how it should be.

A Touch Less Remote: Part 5 of 6


Vicky Spengler | 18:08 UK time, Tuesday, 13 April 2010

The BBC R&D Prototyping team has been investigating how multi-touch software could support television viewing in the future. Creating software for this emerging technology presented a series of design challenges, in particular when looking at how software could be used to help plan TV viewing for that evening.

We built our multi-touch table to develop ideas and prototype concepts that can support multiple user profiles in a way that current TV remotes can't.

Although the size of the surface is well suited to a shared experience, developing an interface for multiple users was a design challenge in itself. The application needed to simultaneously support personalised elements in a shared interface and identify who owns which content. Our solution is that each user who comes to the table has an avatar and colour coding to show ownership of content.

wide_view.jpg

Another challenge was to present users with a simple interface and a manageable amount of information, requiring only minimal effort to find and consume the programmes they like from the hundreds of channels and on-demand programmes available.

We were interested to test how users might feel about creating a personalised schedule based on their preferences and past viewing habits, and then tweaking it on the fly depending on their mood that evening.

sources_CU.jpg

If they are in the mood for a comedy programme then they can give more influence to that source of recommended programmes by stretching the object labelled Comedy. If users see a programme they don't like, they can remove it by pinching it until it disappears. Users can see how their interventions impact on the schedule as it dynamically adds, removes or shuffles programmes in front of their eyes before they commit to it and play it out on the primary display.
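One way to picture the mechanics described above: each 'source' of recommendations carries a weight, which stretching or pinching its on-screen object raises or lowers, and the schedule is rebuilt from the weighted sources. A minimal sketch of that idea in Python (the real prototype was built in Flash; every name and number here is illustrative, not taken from it):

```python
# Hypothetical sketch: rebuild a schedule from weighted recommendation
# sources. Stretching the "Comedy" object would raise its weight, so
# comedy programmes float to the top of the playlist.

def build_schedule(sources, slots):
    """Fill `slots` programme slots, favouring heavily weighted sources.

    sources: dict mapping source name -> (weight, ordered programme list)
    Returns a list of (source, programme) pairs.
    """
    candidates = []
    for name, (weight, programmes) in sources.items():
        for position, programme in enumerate(programmes):
            # Earlier items in a source's list are stronger suggestions;
            # dividing by (position + 1) interleaves the sources rather
            # than letting one genre monopolise the schedule.
            candidates.append((weight / (position + 1), name, programme))
    candidates.sort(reverse=True)
    return [(name, prog) for _, name, prog in candidates[:slots]]

sources = {
    "Comedy": (3.0, ["Comedy A", "Comedy B"]),  # stretched: weight raised
    "Drama":  (1.0, ["Drama A", "Drama B"]),
}
schedule = build_schedule(sources, slots=3)
```

Removing a programme by pinching it away would simply delete it from its source's list before the schedule is rebuilt, which is what makes the playlist reshuffle "in front of their eyes".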

We opted for a slight variation on the 'pinch to shrink and spread to enlarge' design pattern, which is used to decrease or increase the size of objects on screen. In the interests of rapid development, it was much simpler to scale the objects along the horizontal axis only.

programme_info.jpg

A single touch and hold on an object is used to view additional information, like the time that the programme is available to watch or if it is on demand (available now). It was particularly important to use quite simple gestures, as the type of silicone layer we used was not responsive enough for anything much more sophisticated.

For this prototype we felt it was important to strip back the visual layer and focus mainly on the mechanics of the interface. Again, for ease of use there is only one major function being performed by the system, even though it is able to detect multiple users.

Overall we believe we've solved some of the fundamental design challenges with how multiple users could interact around a multi-touch device while watching television. In the next blog post we'll be summarising what users made of the prototypes and drawing some conclusions to assist in the design of future projects.

Links

Multi-Touch Systems that I Have Known and Loved  

Weeknotes #9 (08/04/10)


Tristan Ferne | 16:11 UK time, Friday, 9 April 2010

We start the week scratching our heads a bit about the Microblogging project. Vicky and Theo meet about an archive project proposal which is shaping up to be an exciting, meaty design challenge.

Chris B adds a fix for the duplicate artist problem in Coventry and later in the week Chris N finishes the monitoring code. That means the main work on Coventry is finished, we’re just going to leave it running over the next week and monitor its progress. Then all that’s left are a few deployment things once we get the go-ahead to publish.

Chris B has also started some 20%-type work on a multitouch programme guide, it’s pretty inspiring and the design team all want to join in. Meanwhile Theo is working up ideas for his knowledge sharing initiative.

It’s Thursday and we have enough people in the office to have our first stand-up of the week. We’ve got some external blockers on the Microblogging project but otherwise we’re OK. We follow that by making a first cut of the future project ideas. After some frantic last-minute additions we get rid of a few ideas immediately, leaving a more manageable number for detailed review. We’re aiming to get this down to about six really good project ideas by the end of next week. And Glen, our lead engineer, is leaving us on Friday. We give him his coffee-related presents and head off to the pub for a farewell drink at the end of the day. Bye Glen.

Friday afternoon; I’m writing this, Chris N is updating our build of URIPlay, some legacy work, and Vicky is writing another blog post. George is preparing for a set of interviews next week with potential new recruits to R&D.

People we met this week: The new team leader for TV Platforms, a VIP from Vision who’s interested in Mythology, BBC Radio people to discuss RadioDNS, Jason from mobile who has a great idea for the Microblogging project, R&D sysadmins to discuss migrating our servers and Caitlin from Strategy to chat about what we can do for them and they can do for us.

This week our work was disrupted by Easter and road digging just outside our window.

A Touch Less Remote: Part 4 of 6


Vicky Spengler | 09:47 UK time, Tuesday, 6 April 2010

The BBC R&D Prototyping team has been investigating how multi-touch software could support television viewing in the future. This article, written by Fritz Solares, looks at the hardware and software behind the prototypes.

The multi-touch table we used was built for us by the BBC R&D workshops as there wasn't anything commercially available that did exactly what we needed. Most devices described as multi-touch only support three or four simultaneous touches, and we wanted to have several people using it at the same time. On the flipside, most available devices that are truly multi-touch are extremely expensive and use closed technology that doesn't lend itself to rapid prototyping.

hardware.jpg

There are several ways to handcraft multi-touch tables that are well documented on the web. Ours uses the 'frustrated total internal reflection' (FTIR) technique. FTIR devices are cheap as they can be built with consumer items (a webcam, a projector and an out-of-the-box PC) and theoretically they support as many touches as the number of fingers you can fit on the screen. One downside is the devices are bulky because of the distance between the screen and the projector.

For the screen our device uses a sheet of tough transparent plastic covered in a layer of silicone and then a layer of thin translucent sticky-backed film. Strips of infrared LEDs surround the plastic, creating light that shines across it and is reflected within it until the surface is touched, the reflection is broken, and the light is scattered down out of the sheet. This is much the same effect as when you can see your fingerprints but not much else through a full glass of water.

A webcam with an infrared filter is mounted below the plastic to pick up the scattered light, and this is where the software takes over to make sense of the image it receives. We used the CCV application, which takes the video stream from the camera and outputs touch data as a series of coordinates using the TUIO specification, an open framework that defines a common protocol and API for tangible multi-touch surfaces. Some of the benefits of CCV are that it is open source, cross platform, quite robust and easy to use. It can be used with many different types of touch lighting techniques, including FTIR.
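In effect, a client of this kind of touch tracker receives a fresh list of cursor IDs and coordinates on every camera frame, and touch-down, move and lift events fall out of diffing consecutive frames. A simplified Python illustration of that idea (not CCV's or the TUIO protocol's actual code):

```python
# Illustrative sketch: turn per-frame touch lists into down/move/up
# events by comparing the cursor IDs seen in consecutive frames.

def diff_frames(previous, current):
    """previous/current: dicts of cursor id -> (x, y) for one frame each.
    Returns a list of (event, id, position) tuples."""
    events = []
    for cid, pos in current.items():
        if cid not in previous:
            events.append(("down", cid, pos))          # new touch appeared
        elif previous[cid] != pos:
            events.append(("move", cid, pos))          # existing touch moved
    for cid, pos in previous.items():
        if cid not in current:
            events.append(("up", cid, pos))            # touch was lifted
    return events

frame1 = {1: (0.2, 0.5)}
frame2 = {1: (0.25, 0.5), 2: (0.8, 0.4)}  # finger 1 slides, finger 2 lands
events = diff_frames(frame1, frame2)
```

Tracking IDs across frames like this is what lets an application tell several simultaneous fingers apart, which is the whole point of a table that supports more than three or four touches.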

The prototypes themselves were developed in Flash using the ActionScript 3.0 (AS3) language. We have a lot of Flash experience in the team and it's really quick to import new graphics and fonts which makes it good for rapid prototyping.

We used the Touchlib library, which listens to the touch data broadcast by CCV and turns it into standard AS3 events that can be used within new or existing applications. Again Touchlib has the advantage of being open source, and there are quite a few examples available which makes it easy to get up and running. Unfortunately there isn't much documentation and there is little built-in support for higher level interactions such as gestures, which means these need to be coded from scratch. We ended up building our own very simple multitouch application framework as a basis for developing our prototypes.
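As a rough illustration of the kind of from-scratch gesture code this implies (the prototypes themselves were AS3, and these thresholds are invented, not the prototype's): once a touch ends, it can be classified by how far it travelled and how long it was held, which is enough to distinguish the tap, drag and press-and-hold interactions described in these posts.

```python
import math

# Hypothetical gesture classifier built on raw touch events: a touch that
# ends near where it started is a tap (or a hold, if it lasted long
# enough, e.g. to pop up programme information); one that travels beyond
# a threshold is a drag.

DRAG_DISTANCE = 10.0   # pixels; below this the finger "didn't move"
HOLD_SECONDS = 0.8     # press-and-hold threshold

def classify(start, end, duration):
    """start/end: (x, y) touch positions; duration: seconds held."""
    distance = math.dist(start, end)
    if distance >= DRAG_DISTANCE:
        return "drag"
    return "hold" if duration >= HOLD_SECONDS else "tap"
```

Keeping the classifier this coarse mirrors the design constraint mentioned below: the silicone surface wasn't responsive enough for anything much more sophisticated.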

From a technical point of view the project ran quite smoothly although we had a couple of specific design constraints to overcome. Because of the silicone used we found the table surface was not as responsive as we would have liked so as a work-around we built applications with only the simplest of gestures like touch and drag. We also had limited time and this gave us another good reason to stick to simpler interactions. For any future prototypes we would hope to change the surface to something softer and more flexible.

In the next blog post we'll be looking at some of the design challenges in developing these prototypes in more detail.

Links

CCV  

TUIO  

Touchlib library  

Weeknotes #8 (2/04/10)


George Wright | 17:14 UK time, Friday, 2 April 2010

Monday: Tony joins the team today. He's a trainee research engineer and will be with us for 6 months, working on our usual prototypes as well as doing some longer-term research. We have a quick tour of Coventry, now looking and sounding lovely (Theo's finished the designs now), and the Microblogging scheduling tech spike. There's birthday cake today.

We're prepping demos of 3 major projects to people from around the BBC today. I've been away so I get to catch up on what the team have done over the last week or so. Thankfully, our multi-touch table, having been out of action for the last few weeks, was repaired by R&D workshops and delivered in time for the demo. The session goes really well - we have some good suggestions about how to take projects further and can begin working up some new research collaborations.

Tuesday sees Kevin from R&D coming to take our photographs - individuals and the team. It's like school photo day (except no one gets told off for wearing "inappropriate" ear-studs). Ant is also sitting with us for the day. We have a quick meeting to clarify initial scope and requirements for the 'Outreach' project.

We finally get the post up on Mythology; that's basically wrapped up now. Afternoon meeting about consolidating our project documentation process, which is a little ad-hoc at the moment.

Wednesday: we do some maths on our Microblogging trial. Infrastructure has processed over 40 million posts with no errors or failures - we're pretty happy with this. We have continuous deployment for the infrastructure now, which is a Good Thing. We've got problems with duplicate artists in Coventry, and we still need a name! Theo attends the weekly design meeting (for senior design people from around the BBC) and comes back with info about the reorg happening in the wider FM&T.

Excellent feedback on the Mythology blog post - unanimously positive so far. Coventry's nearly done, just a few tweaks to go to make it solid. There was also some talk about typography. We've got a plan of work for the next month: polish off Coventry, some sprints on the Microblogging project, start on an office dashboard and maybe some HTML5 prototypes.

Thursday I have a telecon with my boss about an interesting possible new project, then check in with the usual unfunny April Fools on the web. Caitlin from BBC Strategy comes in to see our recent work and discuss how we can help prototype strategic projects. Tris and I have a meeting to work through a paper he's written about new themes for us.

A colleague from R&D brings an academic to see us, to work up an EU pitch we might collaborate on - exciting stuff. I book train tickets for a BBC Manchester visit - I'll be interviewing trainees and seeing the new Lab Adrian has put together up there. We end the day in the local pub where seemingly all of our old department has turned up - great to catch up with people I haven't seen for a while and hear gossip and rumours about what's going on.

Friday is a short day involving a telecon, this doc, and that's it. Glad it's the weekend.

Surround Video Shoot in Blackpool


Anthony Churnside | 13:00 UK time, Thursday, 1 April 2010

After successfully demonstrating a surround video set-up to an impressed public at Maker Faire in Newcastle the previous week, a contingent comprising Max Leonard, Alice Whittle, Matt Shotton and me set off to Blackpool from the North Lab with the aim of filming a short piece in surround video with Ambisonics audio, presented by Inside Out North West's Jacey Normand. As well as collecting some additional footage for demonstration purposes, our aim was to highlight some of the potential issues a production crew might face when implementing the technology in a complex location shoot.


The concept of surround video is to enhance a normal television experience by adding extra contextual visual information all around the television in a typical living room. Graham Thomas explains more about it in his video here. The capturing rig consists of two cameras mounted next to each other, one with a fish-eye lens with a viewing angle of almost 180 degrees.


maxandme.JPG

Max and me setting up the surround video rig at Blackpool Tower Ballroom

One of the biggest issues faced by this prototype system is the lengthy and exacting calibration procedure required before playback. The throw, orientation, keystone, size and position of the projected image relative to the television must be set before viewing can begin. Currently there is no layer in the software which can re-factor the surrounding projection based on the centre image source, which means the zoom for both cameras must remain fixed for the duration of the entire shoot. The vast space available in our first location - the magnificent Blackpool Tower Ballroom - afforded us quite a luxury in sidestepping this problem: we framed our shots by varying the proximity of the subject to the lens, rather than by zooming. This worked up to a point, but some of our ideas for more distant shots (especially those on the upper balconies) were curbed by the reach of the lenses, which had already been set at a compromise for all the shots we intended to take.


The lavish decoration and marvellous acoustic of the ballroom provided us with the perfect footage to demonstrate the merits of being immersed in combined surround video and Ambisonics, and after being treated to the sound of master organist Phil Kelsall on the hundred-year-old Wurlitzer, we moved outdoors for some exterior shots to finish Jacey's pieces to camera.

Bidding farewell to Jacey, we moved on down the front towards the Pleasure Beach to meet veteran NWT cameraman Andy Cooke. Surround video lends itself very well to point-of-view motion, which is why the initial test footage was shot on the Docklands Light Railway in London's east end. We decided to take this one step further: standing 72m above the waterfront, The Big One at Blackpool's Pleasure Beach dominates the southern-facing skyline. Andy had previous experience of filming The Big One when it opened some 16 years ago, and between us we hoped we could come up with a safe and practical way of mounting our camera system to the front of the roller-coaster and getting back some usable footage without it plummeting 200 feet to the ground below. I've never been nervous of roller-coasters, but it would be fair to say my most nerve-wracking roller-coaster experience was standing watching the cars go around the track, hoping the surround video rig would stay attached to the front car. Thankfully Andy's experience of rigging cameras to roller-coasters paid off; the equipment survived and we got enough footage to include in the demonstration.



offitgoes.JPG
The nail-biting moment the surround video rig was sent off around The Big One for the first run

In its current state the prototype system needs the relative position of the two videos (the centre picture and surround projection) to remain constant. This means that for the whole shoot the two cameras must remain aligned, with no relative changes to tilt, pan or zoom. Due to the massive forces exerted by The Big One, one of the cameras moved part way through the ride. We were able to correct for this in post-production by rotating and cropping the centre shots of the roller-coaster and digitally zooming the rest of the centre video. Thankfully we were careful not to frame any of the ballroom sequence too tightly, so the digital zoom didn't lose anything important off the edges of the TV.
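The rotate-crop-zoom correction described above comes down to simple arithmetic: trimming a fraction of the frame from each edge to hide the misaligned borders means scaling the remainder back up by the reciprocal. A quick sketch (the 5% figure is illustrative, not from the actual shoot):

```python
# Illustrative sketch of the digital-zoom arithmetic behind the fix:
# cropping frac of the frame from every edge leaves (1 - 2*frac) of it,
# so restoring full-frame size needs a zoom of 1 / (1 - 2*frac).

def zoom_for_crop(frac):
    """frac: fraction of width/height trimmed from each edge (0 <= frac < 0.5).
    Returns the digital zoom factor needed to restore full-frame size."""
    return 1.0 / (1.0 - 2.0 * frac)

zoom = zoom_for_crop(0.05)   # trim 5% from every edge -> ~1.11x zoom
```

This is also why loose framing mattered: any content sitting within that trimmed margin would have been pushed off the edges of the TV by the zoom.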


The day's shoot went really well. Not only did we collect some useful demonstration footage but, more importantly, we now have a greater understanding of the problems extra development of the system might solve, and the considerations required by a crew wanting to film something with this technology.


Many thanks to the staff at Blackpool Tower and The Pleasure Beach for all their help and generosity.
