
Explore the Edinburgh Festivals using 360 video and WebVR

A new interactive 360 video experience on Taster that gives a flavour of the Edinburgh Festivals

Published: 21 August 2017
  • Alia Sheikh

    Senior Development Producer
  • Jake Patterson

    Industrial Trainee
  • Graham Thomas (MA PhD CEng FIET)

    Head of Applied Research, Production

We've just released a new web-based VR experience on Taster that lets you explore last year's Edinburgh Festivals using a collection of 360 videos. As well as finding out what users think about exploring an event in this way, we wanted to see what could be achieved using open WebVR technology, rather than requiring users to download a dedicated proprietary VR app, which is the approach often taken for interactive VR applications.

We have been using the Edinburgh Festivals as a testing ground for technology for some time: they are a good example of an event the BBC covers that encompasses many kinds of live activity over a large area, with lots of involvement from both the BBC and the general public.

VR is known for being good at giving a sense of place and allowing users to explore, so last year we gathered a lot of 360 video, with the aim of experimenting with ways of letting users explore and experience the Festivals. We have now created an interactive experience that uses some of these videos.

As well as helping us find out what users think of using VR to explore Edinburgh in this way, we wanted to see what was possible using WebVR, as it is an open standard and doesn't require users to download a specific app. There are already examples of VR applications on BBC Taster that require an app to be downloaded, but WebVR has the potential to offer immersive and interactive experiences through the user's existing web browser. It generally needs an up-to-date browser and a modern computer, tablet or phone to run smoothly, but it can support interactivity rather than just playing a single video from start to finish. We have structured our experience as a kind of branching narrative, where each video includes interactive overlays that let the user choose what to see next, which would not be possible on video playback platforms such as YouTube.

The application is built on top of A-Frame, a library that provides a base for virtual reality experiences (3D coordinates, geometry, and stereoscopy for Google Cardboard, amongst other things). We've targeted a number of devices for this project: desktop, mobile (Android/iOS), Google Cardboard and Gear VR. A-Frame has helped massively in that we haven't had to write specific code for each device; however, it became apparent that we'd have to adapt the experience for playback in headsets.
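To give a feel for why A-Frame helps, here is a minimal sketch of a single 360 video scene; this is not the project's actual code, and the video filename and A-Frame version are placeholder assumptions:

```html
<html>
  <head>
    <!-- A-Frame 0.6.x was current around the time of this project (assumed version) -->
    <script src="https://aframe.io/releases/0.6.1/aframe.min.js"></script>
  </head>
  <body>
    <a-scene>
      <a-assets>
        <!-- Placeholder filename for an equirectangular 360 video -->
        <video id="festival-video" src="festival-clip.mp4"
               autoplay loop crossorigin="anonymous"></video>
      </a-assets>
      <!-- a-videosphere projects the video onto a sphere around the viewer -->
      <a-videosphere src="#festival-video"></a-videosphere>
    </a-scene>
  </body>
</html>
```

A few lines of declarative markup produce a scene that works across desktop, mobile and headsets, which is what spares us from writing per-device rendering code.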

When the viewer has a mouse at hand, or has a touch-screen, it makes sense to use that as the interaction mechanism. However, in a headset, the only way of interacting with the application is via head movement.

For headsets, an interaction technique we have used in the past is the gaze-based cursor. A cursor appears in the centre of the view and is used to select elements within the scene. When the cursor moves onto an element a timer starts; provided the cursor stays on that element, it activates once the timer finishes. A benefit of this approach is that remote controls are not needed to navigate the application, meaning we don't have to write code for every headset on the market.
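This dwell-timer behaviour maps directly onto A-Frame's built-in cursor component and its 'fuse' option. A minimal sketch follows; the overlay id, its geometry and the timeout value are illustrative choices, not values from our code:

```html
<a-scene>
  <a-camera>
    <!-- With fuse enabled, the cursor starts a timer when it rests on an
         element and emits an ordinary 'click' event when the timer completes -->
    <a-cursor fuse="true" fuse-timeout="1500"></a-cursor>
  </a-camera>

  <!-- A hypothetical navigation overlay the user can gaze at -->
  <a-plane id="next-scene" position="0 1.6 -3" width="1.5" height="0.6"
           color="#222"></a-plane>
</a-scene>

<script>
  // The gaze selection arrives as a normal 'click', so the same handler
  // also serves mouse and touch input on non-headset devices
  document.querySelector('#next-scene').addEventListener('click', function () {
    console.log('overlay selected');
  });
</script>
```

Because the fuse cursor emits standard events, one code path can handle gaze, mouse and touch interaction.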

We've made sure our code base isn't bespoke to this application: the core application is separate from the structure and content of the Edinburgh Festivals experience. An experience is made up of its assets (videos, text, etc.) and a JSON description, essentially a recipe telling the application how to build the experience from the assets. In the description, scenes are broken down into computer-readable objects. Each object tells our application which video relates to that scene, which navigation elements should appear, and any events that happen in that scene. We have a number of events built into the system. For most uses, there will be a 'play' event at the end of each scene, indicating the next scene to play. Each scene in the Edinburgh experience references itself in this event, creating a looped video.
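For illustration, a scene object along these lines might look like the following; the field names and scene names here are hypothetical, chosen to show the shape of such a recipe rather than our actual format:

```json
{
  "scenes": {
    "royal-mile": {
      "video": "assets/royal-mile.mp4",
      "navigation": [
        { "label": "Street performers", "target": "street-performers" },
        { "label": "Edinburgh Castle",  "target": "castle" }
      ],
      "events": [
        { "type": "play", "when": "end", "scene": "royal-mile" }
      ]
    }
  }
}
```

Here the end-of-scene 'play' event points back at the same scene, so the video loops until the user selects one of the navigation elements.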

Please try the experience and let us know what you think via Taster's ratings system. This will give us useful insights into what WebVR on current platforms is capable of and how users react, and will inform future work on tools for building interactive WebVR applications, so we can offer viewers compelling immersive experiences without them having to download and install proprietary applications.

We'd like to thank Jack Kibble-White for his work on the editorial side of the project, Sam Nicholson, who helped with much of the shooting, Stephen Anderson for editing and stitching, Paul Golds for help with web development, Cat Bell for her timely assistance on the admin side of the project, Matthew Scarth for pulling together the BBC Taster launch, and the performers at the Edinburgh Festivals who worked closely with us to create 360 experiences of their acts.

  • Immersive and Interactive Content section

The Immersive and Interactive Content (IIC) section is a group of around 25 researchers investigating ways of capturing and creating new kinds of audio-visual content, with a particular focus on immersion and interactivity.
