
BBC R&D at the Great Exhibition of the North

The first trial of our new broadcasting platform at the biggest event in England during 2018.

Published: 21 June 2018
  • Chris Northwood (MSc), Senior Research Technologist

Between 21 June and 1 July, teams from BBC Research & Development will be working with partners from within the BBC and Culture UK at the Great Exhibition of the North (GET North) to provide technology for live-streamed performances from a number of artists. At first glance, live streaming an event might not seem the most innovative thing; after all, the BBC has been covering events for years, including streaming them online. For GET North, however, we'll be using R&D's Trial Platform to produce this content.

At BBC R&D we've been asking ourselves what the next-generation technical platform to support broadcasting will look like. We've been looking at how we can use Internet technologies to move broadcasting infrastructures from bespoke, SDI-based systems to ones based on streaming bits over IP networks. Going beyond this, we've looked at how we manage broadcasting infrastructure within a completely IT-based system, and have built our prototype IP Studio software to test the concepts that feed into the specifications being published via AMWA's Networked Media Incubator.

We think that moving broadcasting to software allows us to radically change the way production workflows look, allowing for greater flexibility and more efficient ways of working, and eventually enabling object-based media that delivers new experiences for our audiences. By providing a platform as a set of APIs and capabilities, we can build new tools and workflows on top of that platform. We've built many APIs, capabilities and tools, which we have previously used in trials in partnership with BBC teams as part of our living lab at the Edinburgh Festivals, but each of those was a bespoke deployment for that event. What we have been building since then is an instantiation of this platform, with a supported set of capabilities and tools for low-cost remote live production. This incorporates SOMA and capabilities of IP Studio, including dynamic media composition. I've previously written about how we intend to use DevOps techniques to build a site reliability engineering team to build and support this platform, and our trial at GET North is the first production use of this platform with this automation.

What we have built is a set of bare-metal machines which can be automatically provisioned and configured using industry-standard tools such as PXE boot and Ansible. This sits on top of a high-performance networking layer that we can also control using our Ansible automation suite. In future, we aim to extend this with cloud computing systems, and our Cloud-Fit Production project is investigating how best to utilise the cloud in production environments.
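
To give a feel for what this automation looks like, here is a minimal sketch of an Ansible playbook for a freshly PXE-booted node. The host group, package names and tuning values are hypothetical illustrations, not our actual configuration:

```yaml
# Hypothetical sketch: configure a PXE-booted bare-metal node as a
# media host. Group names, packages and values are illustrative only.
- hosts: media_nodes
  become: true
  tasks:
    - name: Ensure precision time sync tooling is present
      package:
        name: linuxptp
        state: present

    - name: Raise socket buffer limits for high-bitrate media streams
      sysctl:
        name: net.core.rmem_max
        value: "268435456"
        state: present
```

Because playbooks like this are idempotent, the same run can provision a machine in the datacentre or one in a flight case on site, which is what makes remote, lightly-staffed deployments practical.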

Our portable rack for #GetNorth2018 trials with @BBCRD. I've blogged about what we're up to!

— Chris Northwood (@cnorthwood) June 22, 2018

Some of these machines are based in our MediaCityUK datacentre, but a number of them are also loaded into portable flight cases which we can take to the remote production site. SOMA is designed to work with unoperated cameras: static, locked-off UHD cameras give a wide shot, and crops can be taken out of these shots to create virtual cameras, giving the SOMA operator more choice than the locked-off cameras alone would. The output of the system is ultimately 1080p HD footage, so no loss in quality is introduced by using crops.
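
The reason the crops lose no quality is simple arithmetic: a UHD frame (3840×2160) has exactly four times the pixels of a 1080p frame, so a 1920×1080 crop is a pixel-for-pixel extract rather than an upscale. A minimal sketch (the helper function is hypothetical, not part of SOMA):

```python
UHD_W, UHD_H = 3840, 2160   # locked-off UHD camera resolution
HD_W, HD_H = 1920, 1080     # platform output resolution

def virtual_camera_crop(x: int, y: int) -> tuple[int, int, int, int]:
    """Return the (left, top, right, bottom) pixel rectangle of a
    1080p 'virtual camera' taken from a UHD frame.

    The crop is a pixel-for-pixel extract, never an upscale, so it
    introduces no quality loss in the 1080p output.
    """
    if not (0 <= x <= UHD_W - HD_W and 0 <= y <= UHD_H - HD_H):
        raise ValueError("crop falls outside the UHD frame")
    return (x, y, x + HD_W, y + HD_H)

# A UHD frame holds exactly 4x the pixels of a 1080p frame, so even
# a quarter-frame crop still fills the output at native resolution.
assert (UHD_W * UHD_H) == 4 * (HD_W * HD_H)
print(virtual_camera_crop(960, 540))  # centre crop: (960, 540, 2880, 1620)
```

Panning a virtual camera then becomes a matter of animating `x` and `y` within those bounds, with no physical camera operator required.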

For our trials at Edinburgh, we used dark fibre connecting back to R&D's own network at 100Gbit/s, which allowed us to experiment with moving raw video around a remote production site, but this approach is not available at a reasonable cost to all sites. With the Trial Platform, we want to be able to use commercially available fibre Internet connections, where providers offering up to 1Gbit/s are becoming increasingly common at reasonable cost. As a result, the machines we take on site record the content, and only low-resolution proxies are streamed at low latency to the SOMA interface, while the on-site machines store a higher-quality (but still compressed) version at full resolution. The high-quality output is then rendered by machines at our MediaCityUK datacentre, delayed by a number of seconds. The renderer only needs to fetch from the on-site store the content that is currently being used in the rendered output, and the delay gives it time to warm up a buffer by looking ahead of any edit decisions, so it can switch smoothly to the remote stream. This allows us to run events from places with relatively low Internet speeds, but still with UHD broadcast-quality content.
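
The effect of the render delay can be sketched as a small calculation. The delay value, data class and function names here are hypothetical, purely to illustrate the buffering idea, not our actual implementation:

```python
from dataclasses import dataclass

RENDER_DELAY_S = 5.0  # hypothetical gap between live capture and render

@dataclass
class EditDecision:
    wall_clock_s: float  # when the operator made the cut
    source: str          # which on-site store holds the content

def prefetch_lead_time(edit: EditDecision, now_s: float) -> float:
    """Seconds the delayed renderer has to warm a buffer for this cut.

    The renderer runs RENDER_DELAY_S behind real time, so a cut made at
    time t is not needed in the output until t + RENDER_DELAY_S. Every
    known edit therefore gives the renderer that much lead time to start
    fetching the high-quality stream from the remote store.
    """
    render_time_s = edit.wall_clock_s + RENDER_DELAY_S
    return max(0.0, render_time_s - now_s)

cut = EditDecision(wall_clock_s=100.0, source="camera-2")
print(prefetch_lead_time(cut, now_s=101.0))  # 4.0 seconds to warm the buffer
```

The trade-off is latency for bandwidth: the output lags live by a few seconds, but the high-quality fetch only ever has to keep up with what is actually in the cut, not with every camera at once.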

Once rigged, this system can be left unattended, and a remote operator can use the SOMA interface by connecting over the public Internet, meaning they can be based at a BBC site, or even work from home. The bandwidth requirements of the SOMA interface are relatively low, and a home wi-fi connection is all that is needed to control the system. Due to the automation we have in place, our support engineers can also work remotely, limiting the number of staff who need to be on-site.

You will be able to watch the opening night on the BBC Arts live page, with the following nights also streamed online. My R&D colleague Jasmine Cox will also be at GET North as a mentor for artists in residence.

R&D's presence at the Great Exhibition of the North is part of the Culture UK initiative, which partners the BBC with the arts and cultural sector all over the UK, enabling them to take advantage of R&D's technology on the Trial Platform, including low-cost live streaming.

-

BBC R&D - High Speed Networking: Open Sourcing our Kernel Bypass Work

BBC R&D - Beyond Streams and Files - Storing Frames in the Cloud

BBC R&D - IP Studio

BBC R&D - IP Studio: Lightweight Live

BBC R&D - IP Studio: 2017 in Review

BBC R&D - IP Studio: 2016 in Review

BBC R&D - IP Studio Update: Partners and Video Production in the Cloud

BBC R&D - Running an IP Studio

BBC R&D - Building a Live Television Video Mixing Application for the Browser

BBC R&D - Nearly Live Production

BBC R&D - Discovery and Registration in IP Studio

BBC R&D - Media Synchronisation in the IP Studio

BBC R&D - Industry Workshop on Professional Networked Media

BBC R&D - The IP Studio

BBC R&D - IP Studio at the UK Network Operators Forum

BBC R&D - Covering the Glasgow 2014 Commonwealth Games using IP Studio

BBC R&D - Investigating the IP future for BBC Northern Ireland
