Six Months in the Life of the Radio & Music Product
Team Eno's morning SCRUM meeting
I joined ´óÏó´«Ã½ Future Media in January 2011 as Head of Radio and Music with the job of defining and delivering the new Radio and Music product as part of ´óÏó´«Ã½ Online's 10 products, 4 screens, 1 service strategy.
I manage around thirty people, who break down into:
- Product Management (decide what we build)
- Project Management (decide when we build it)
- Development (decide how we build it and actually build it)
I was extremely lucky to inherit teams of very smart and passionate people with a deep understanding of the portfolio of sites that make up the product, and of the infrastructure which supports that. I am also very fortunate to have a great partnership with the fantastic editorial team in Audio and Music under Mark Friend, who work directly with the Radio Network staff and help us to understand their priorities.
On Friday 4th May, at the next ´óÏó´«Ã½ Online Briefing, Mark and I will be presenting an update on the Radio & Music product. So this seemed like a good time to give you an in-depth look at what we've been doing over the past six months.
As some of you will know, each of the ´óÏó´«Ã½'s radio networks has long had its own website. These have evolved organically over the years to support the needs of their very different audiences, and behind those sites there is a complex set of systems which manage the metadata, schedules, encoding, rights, "now playing" messages and so on, all built on a trail of technologies which reflect the evolution of the Internet.
My goal was to work out what to replace and how so that we could deliver an even better experience to the audience and simplify the operational work to reduce costs.
A large part of the challenge was how to build something that works equally well for Radio 1 and Radio 4 listeners without becoming the lowest common denominator. Another part of the challenge was to make it easier for features developed for one network to be available for another.
What do we want?
After a series of facilitated brainstorming sessions with the Product Management, Editorial and Technical teams we produced a list of over 250 major features ("epics") based around three big bets: "Live", "Audio Discovery" and "Music".
At that point we had a pretty good idea of where we wanted to get to, along with some idea of the user experience from our UX colleagues. Next we took a swing at how long it would take us, using the model of T-shirt sizes to get a sense of how long each "epic" might take.
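For anyone unfamiliar with T-shirt sizing, the arithmetic is deliberately coarse. A minimal sketch of the idea follows; the size-to-sprint mapping and the epic names in it are made up for illustration, not our actual estimates.

```python
# Illustrative only: a coarse size-to-effort mapping for roadmap planning.
# The numbers and epic names are hypothetical, not the team's real figures.
SIZE_TO_SPRINTS = {"S": 0.5, "M": 1, "L": 2, "XL": 4}  # effort in team-sprints

# A tiny slice of a backlog as (epic, size) pairs; the real list had 250+ epics.
epics = [
    ("Listen live from the radio homepage", "M"),
    ("Customisable station presets", "L"),
    ("Cross-network audio discovery", "XL"),
]

total = sum(SIZE_TO_SPRINTS[size] for _, size in epics)
print(f"Rough total: {total} team-sprints for {len(epics)} epics")
```

The value is in the relative sizes rather than the absolute numbers: they are just good enough to sketch a roadmap that can then be corrected against real velocity.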
When do we want it?
The Olympics was an interesting milestone about 12 months out, so we decided to go for a series of releases, one every three months.
This was just enough of a "left to right" plan to get going knowing that:
- no plan survives first contact with the enemy
- we had only high level estimates on the huge number of epics we had pulled together
- we had no track record on which to base our estimates of velocity anyway
- the end date was over the horizon
So - let's get started!
Now the question became "where?"
It turned out after reviewing our portfolio of sites that www.bbc.co.uk/radio/ was a pretty good candidate for our first deployment.
This site had been around for years, is linked to from the global navigation and receives a reasonable amount of traffic (400k-450k unique browsers per week), but it has a very contained set of user journeys ("I want to listen live to Radio 2", "Get me to the Radio 4 homepage") and, as luck would have it, was in need of some technical work anyway.
And then how?
We did a quick analysis of the primary user journeys and agreed a Minimum Viable Product (MVP) with the stakeholders, which would allow us to launch and then inspect and adapt based on empirical data. After all, it's what you think that matters.
Our stakeholders have been great at understanding and embracing the approach. It is never easy, especially in an organisation which is used to "shipping" a finished radio programme only when it is perfect, so I've been very impressed with the way they have adapted to this crazy agile thing.
We were a tiny bit nervous about upsetting some users, so we started with an idea that we would do some A/B testing on the new site, but given how radically different the new product designs were from the existing site we quickly changed our minds and opted for a beta launch instead.
We'll come back to A/B testing for slightly more subtle decisions.
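When we do run A/B tests on those subtler decisions, the underlying mechanic is simple: assign each browser to a variant deterministically and compare behaviour between the groups. Here is a minimal sketch under assumed names; the experiment and variant labels, and the idea of hashing a browser identifier, are illustrative rather than a description of the ´óÏó´«Ã½'s actual testing setup.

```python
import hashlib

def ab_variant(browser_id: str, experiment: str, variants=("A", "B")) -> str:
    """Deterministically assign a browser to a variant.

    Hashing the browser id together with the experiment name means the
    same browser always sees the same variant, and different experiments
    split the audience independently of each other.
    """
    digest = hashlib.sha1(f"{experiment}:{browser_id}".encode()).hexdigest()
    return variants[int(digest, 16) % len(variants)]

# Usage: decide which layout of a homepage module a visitor sees.
print(ab_variant("browser-123", "station-switcher-layout"))  # "A" or "B"
```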
How we organised our teams
We also had to determine how best to assign the people in the team (we agreed very early that I wouldn't use the word "resource") to balance the workload.
There was clearly:
- a lot of new development that needed to be done;
- an existing portfolio of pages, visited by nearly five million unique browsers a week, which needed to be supported and maintained;
- a steady stream of ideas from the brilliantly creative teams in the networks about how to engage with our audiences online.
Oh and I didn't have infinite headcount (who knew?).
We took a guess at how much support Business As Usual (BAU) would need and then started to roll people over from their existing work to the new product.
As we did this we evolved a plan based on a set of virtual scrum teams divided between New Product development and Business As Usual. This allows us to create cross-functional scrum teams and move people between the teams without having to change their managers.
We let the teams choose their names and they came up with "Bowie", "Eno", "Propellerheads" and "Gaga" - no prizes for guessing the theme - and we have since added "Moby".
The New Product teams (Bowie, Eno and Moby) operate on three-week sprints; Gaga have just started tracking workflow with Kanban, which is a better way of handling the business-as-usual and short-order work that they manage.
Teams and Sprints both have names
So we now had something to shoot for and we quickly started creating the detailed user stories and designs.
The teams had already been operating in an agile way, but had been delivering across a wide range of short-term projects; now that we were moving to working on a single product with a 12-month roadmap, much of the process was new.
As part of this transition we wanted to build in a principle of "Delivering Predictable Quality" so that we could demonstrate to all our stakeholders that we would keep our promises both for time and quality. Those of you familiar with running projects won't be surprised to know that we needed to flex the scope in order to do this, and we've had great support from our stakeholders in managing this.
Because the teams were going to be learning to develop on a new platform (the ´óÏó´«Ã½'s Platform, sometimes known as "Forge"), we wanted to start off steady and give ourselves the best possible chance of success. We decided we needed an "input pack" for each sprint which contained the prioritised list of user stories (the groomed backlog) and the detailed designs, with notes on how the designs were to respond to user actions (annotated wireframes and visual designs).
We had a bit of trouble initially deciding where in the cycle the acceptance criteria should be created: by the dev team during the sprint, or as part of the input pack. We have settled on the product management team defining the business acceptance criteria in the input pack, and the QA folks writing the technical acceptance criteria (in the form of Cucumber tests) at the start of the sprint.
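To make that split concrete, here is a sketch of what a technical acceptance criterion can look like. The scenario wording, URL and step definitions below are invented for illustration, and the steps are written with Python's behave library (which consumes the same Gherkin syntax as Cucumber) rather than the team's actual toolchain.

```python
# features/radio_homepage.feature (Gherkin, written at the start of the sprint):
#
#   Scenario: The radio homepage responds
#     Given the radio homepage at "http://www.bbc.co.uk/radio"
#     When I request it
#     Then I get a successful response
#
# features/steps/radio_homepage_steps.py -- illustrative step definitions.
from behave import given, when, then
import requests

@given('the radio homepage at "{url}"')
def step_given_homepage(context, url):
    context.url = url

@when('I request it')
def step_request(context):
    context.response = requests.get(context.url)

@then('I get a successful response')
def step_check_response(context):
    assert context.response.status_code == 200
```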
This model has served us reasonably well for the last six months, although as with everything we do it has evolved, and will continue to evolve, based on how well it is doing.
To keep the input packs coming we created the "Propellerheads" team, who are responsible for defining the work that the development teams will do in the next sprint and beyond.
This team is a combination of UX, product management, business analysis and technical leads who turn the un-groomed backlog (the wishlist) into something that the dev teams can actually get stuck into, make reasonable estimates on, and get good at delivering on those estimates.
As with all new things, you have to give it a few iterations before you can work out whether you need to change the process or just get better at following it.
We've been pretty pragmatic about this and the teams have taken responsibility for making that judgement of when they want things to change.
Architecture
Yes, we did have one; we didn't just make it (all) up as we went along!
Radio and Music Product Release 1.2 Architecture
We knew from the start what underlying architecture we would be building on, and with some enthusiastic supporters of responsive design on the team we decided to go for a single application for each major feature area, each serving the minimum number of views for each URL, each of which would behave appropriately on the platform in question.
The four platforms we support are Desktop, Tablet, Mobile and Connected Devices (TVs and Hybrid Radios).
We needed to focus our attention where the opportunity was greatest, so we went for Desktop first, followed by Mobile, then Tablet and lastly Connected Devices. We analysed the key user needs for the different platforms and decided to serve one view for both desktop and tablet, and to use device detection to serve a single view for all mobiles irrespective of form factor.
On Mobile, the big question was whether to make a dynamic WebApp, an installable Native App or a Hybrid. We agreed that a WebApp would be a good place to start, and that we would plan to follow up with a Native App offering a richer set of features that couldn't be supported via a WebApp.
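As a rough sketch of what serving one view per platform group from the same URL can look like, here is a minimal example of routing on the User-Agent header. The user-agent shortlist, template names and route are placeholders for illustration; the product's real device detection is considerably more sophisticated than this.

```python
# Minimal sketch: one URL, one view for mobiles, one shared view for
# desktop and tablet. The matching below is deliberately naive.
from flask import Flask, request, render_template

app = Flask(__name__)

MOBILE_HINTS = ("iPhone", "Android", "BlackBerry")  # hypothetical shortlist

@app.route("/radio")
def radio_homepage():
    user_agent = request.headers.get("User-Agent", "")
    if any(hint in user_agent for hint in MOBILE_HINTS):
        # A single view for all mobiles, irrespective of form factor.
        return render_template("radio_mobile.html")
    # Desktop and tablet share one view.
    return render_template("radio_desktop.html")
```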
Code Reviews
So then we started looking at the development approach. One of the first things I read when I joined was feedback from one developer who had not had his code reviewed in three years. That was a surprise but given the work the teams had been doing I could see how it had happened.
No more.
We kicked off a code review process where no code is checked in unless it has been reviewed, either by pair programming or a separate code review.
There are plenty of people who can wax eloquent on the merits of code review but for me there are two things: better code and better developers. Fewer bugs on each check-in, and the developers learn from each other.
All good.
Anyway, we are sticking with that and also employing other good engineering practices to make sure we continue to deliver predictable quality.
So how have we done so far?
I'm really proud of the way the team have responded to this challenge. I recently read a post with an image that really struck home when I think about the way that the team have created something that simply wasn't there before.
We've shipped to live every sprint since we first went out on beta in August. (That's every three weeks rather than every three months as I had originally suggested, so what did I know?)
We've tried to make sure that these changes are progressive, rather than disrupting the audience with a pseudo-random sequence of changes. It is always difficult to introduce change, but so far the signs are positive: we've had lots of really great feedback, some positive and some constructive, and we are reading all of those comments as they come in, as well as looking at the empirical data to understand how our work is being received.
Our last release included some features we've been working on for a while, such as the ability to customise your preset stations on desktop as well as mobile.
We are using ´óÏó´«Ã½ iD to store this information server side (rather than in a cookie) so that it is available on other devices, and we will continue to expand the number of interactions that involve iD.
We strongly believe that the more personalised the service the more valuable it is: whether you are loving tracks on Radio 1 or saving an interesting Radio 4 show to listen to later, all of these things should follow you seamlessly (ok, I'm sorry, I had to get the "s" word in) across devices.
I look forward to more improvements with each release of the product. For example, your choice of stations will soon follow you to your mobile phone.
Chris Kimber blogged in February about a significant release for desktop and mobile, and I look forward to hearing what you have to say in the comments.
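A rough sketch of the shape of that server-side approach: presets are keyed by the signed-in user's ´óÏó´«Ã½ iD rather than held in a per-browser cookie, so any device the user signs in on can read them back. The store and function names below are invented for illustration and are not the real ´óÏó´«Ã½ iD API.

```python
# Illustrative only: an in-memory stand-in for a server-side preset store
# keyed by ´óÏó´«Ã½ iD, instead of a cookie tied to one browser.
from typing import Dict, List

_presets: Dict[str, List[str]] = {}  # bbc_id -> ordered list of station ids

def save_presets(bbc_id: str, stations: List[str]) -> None:
    """Persist the user's chosen preset stations against their ´óÏó´«Ã½ iD."""
    _presets[bbc_id] = list(stations)

def load_presets(bbc_id: str) -> List[str]:
    """Return the same presets on whichever device the user signs in on."""
    return _presets.get(bbc_id, [])

# The presets saved on the desktop site follow the user to mobile:
save_presets("user-42", ["bbc_radio_two", "bbc_radio_four"])
print(load_presets("user-42"))
```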
Andrew Scott is the Head of Radio and Music Product, ´óÏó´«Ã½ Future Media
Comment number 1.
At 26th Apr 2012, Russ wrote: Here are ten long-standing questions whose issues seem to have inadvertently eluded mention in the above blog:
1 Does the recent announcement that radio will be included in the XBox 360 iPlayer implementation constitute an about-turn on previous announcements that radio will disappear from iPlayer later this year?
2 What progress is there to report on radio stations being able to share programme information?
3 Are you planning to get rid of station programme schedule pages? Or are you planning to disconnect station programme schedule pages from iPlayer feed data, and if so, when?
4 When will the pointless practice of re-recording pre-recorded radio programme files cease?
5 When will the radio pages receive their GEL-conformant page widths?
6 What progress has been made regarding the commitment to present radio content differently from the current carousels?
7 Has there been any progress in getting radio programmes into the latest programme carousels concurrently with their appearance in iPlayer?
8 What is the timetable for the radio product to be able to show, for a logged-in user, the last programme(s) listened to and whether a favourited programme has been listened to?
9 Have you yet solved the problem of programme series disappearing from the programme carousels while a new series episode is undergoing its wait until it appears in the programme carousel?
10 Are you planning to continue listings for 'featured' and 'latest' programmes where the content of such listings is identical?
Russ
Comment number 2.
At 27th Apr 2012, Keith wrote: I quite like the layout that's beginning to appear on the revamped local schedule pages such as /cambridgeshire/programmes/schedules. It would be useful if the top-tier navigation on those pages could include links to the programme a-z. Also, the home link on such pages (including the new /radio/stations/cambridgeshire) is incorrectly linking to a URL which redirects to the local news site.
Hopefully the navigation on the /radio site will be amended so that it complies with the GEL layout, in that it appears below the section title.
Comment number 3.
At 1st May 2012, sean wrote: This comment was removed because the moderators found it broke the house rules.
Comment number 4.
At 23rd May 2012, Andrew Scott wrote: Thanks for the comments - here are my (delayed) responses:
@Russ
1 It remains our intention that, as the current version of iPlayer (v3) rolls out across more devices, it will include Radio. The fact that it wasn’t in the original release on the XBox 360 was simply a question of sequencing.
2 The backend work to enable this is progressing well; however, the visible results will take some time to filter through.
3 We have no intention of getting rid of station programme schedule pages. These are very popular on some networks, and we have provided a clear link to these in the navigation bar toward the top of each page.
4 We currently have a single process for generating recordings of transmitted content, which means that we don’t need separate flows depending on whether the content was pre-recorded or (as is the case with the majority of our content) transmitted live. This will be subject to review as and when we look at replacing the system.
5 All of the radio pages are in the process of being brought together into the radio and music product. This isn’t happening in one go, so some of the pages will be updated sooner than others. www.bbc.co.uk/radio was the first page we updated; we are currently working on the next pages and expect to share our progress this summer, probably with a public beta to start gathering feedback.
6 As above, we are expecting to share our progress this summer.
7 We believe that we have fixed this feeds issue now, so there should no longer be a disparity between iPlayer and our latest programmes carousel.
8 These are certainly good features, and ones we have in our backlog, but they haven’t got to the top of the priority stack yet. We develop iteratively and release pretty much every three weeks, so we will get to them as they reach the top of the stack. We don’t commit to timetables for specific features.
9 The programmes carousel shows the latest or most popular programme episodes (not series) available. As encoding of a programme takes a period of time, there can be a short amount of time when the expired episode has been removed as it is no longer available, but the latest episode has not yet become available.
10 While we do believe that it is appropriate to remove duplication within a view, we don’t believe it is appropriate to remove duplication across different views. These views represent different user needs: either to find something interesting among the featured programmes, or to see the latest ones.
Comment number 5.
At 7th Jun 2012, Russ wrote: Andrew - I am delighted to hear you have no current plans to remove Radio from iPlayer v3. It does beg the question of what may be included in v4, but no doubt the ´óÏó´«Ã½ will make a preliminary consultative statement on that matter in due course.
I am also pleased to hear that the process of re-recording pre-recorded content will be subject to future review. Btw, I believe you have more than "a single process" here, since I understand podcasts, now increasing rapidly in number, are put in an 'advance publishing file': these podcasts are often (invariably?) available long before the 're-recorded versions' appear in the iPlayer system. Would it not save you a lot of time if the podcasts carried some sort of iPlayer metadata tag that both precluded the need for re-recording and allowed their instant availability in the normal playback avenues?
Russ
Comment number 6.
At 14th Jun 2012, Danny Ghezi wrote: All this user's posts have been removed.