
Little Sun - BBC R&D create participatory art experience at Tate Modern


Matthew Brooks | 14:00 UK time, Wednesday, 8 August 2012


BBC R&D have joined forces with Studio Olafur Eliasson to create a participatory art experience at the Tate Modern. In 18 days, we created an installation and launched a website, bringing thousands of artworks created by the public to anyone with a modern web browser. In this blog post, I'll give you a rundown of how we managed to do this in just under three weeks.

Olafur Eliasson's recent project is concerned with bringing light to the 1.6 billion people in the world who don't have access to electricity. To engage people with the importance of access to light, Olafur wanted to create an installation in which people could create light paintings. Not only that, he wanted all of those light paintings to form a sun that could be explored on the internet. There would be a blacked-out room at the Tate in which to create the light graffiti, a live feed of the creation process, and an online digital sun allowing the light graffiti to be explored. Olafur explains his concept much more eloquently than I do.

Our Chief Scientist Brandon Butterworth had been in conversation with Olafur about the project for some time. Olafur had created The Weather Project in the Tate Modern's Turbine Hall in 2003, an artwork I remember being fascinated by. When Brandon sent an email around asking for volunteers for a project involving Olafur, I jumped at it, as did several other brave and foolhardy R&D engineers.

After an initial chat with Olafur and his studio, two challenges stood out: how would we create the light graffiti photographs themselves, and how would we make an online light graffiti sun out of hundreds or even thousands of them?

We considered reusing part of an earlier project to create a pulsating sun from several video feeds, but this approach wouldn't scale to the number of light graffiti Olafur was talking about. There's a limit to how many videos can be played back simultaneously, especially over the web.

To create photographic light graffiti, a fair amount of experimentation is required to balance long exposure times with the amount of light entering the camera lens. We didn’t know how long people would paint for, and we needed an installation that the Tate staff could operate without being expert photographers. So we decided to fake long exposure photography in software to give us the control we needed.

Chris Pike started experimenting with thresholding video in openFrameworks, a creative C++ library. Initially, for each video frame, pixels with a luminance above a threshold level were added to a long exposure picture buffer, as you can see from Chris's early screen grab.

Chris Pike Makes Progress

The thresholding didn't capture any of Chris's movement as he wrote the word 'Progress' with his torch - only the light portions of the frame were retained. However, those portions were very hard-edged, a problem Chris subsequently solved by taking the maximum value between old and new pixels to achieve smoother blending.
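For the curious, here's a minimal sketch of the idea in Python with OpenCV and NumPy - an assumption of the approach rather than Chris's actual code - thresholding each frame and accumulating with a per-pixel maximum:

```python
# A minimal sketch of the fake long exposure (an assumption of the
# approach, not Chris's implementation). Assumes a camera at device 0
# delivering 8-bit BGR frames.
import cv2
import numpy as np

THRESHOLD = 200  # luminance above which a pixel counts as "light"

cap = cv2.VideoCapture(0)
exposure = None  # the accumulated long exposure picture buffer

while True:
    ok, frame = cap.read()
    if not ok:
        break
    if exposure is None:
        exposure = np.zeros_like(frame)

    # Keep only the bright portions of the frame...
    luma = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    mask = ((luma > THRESHOLD) * 255).astype(np.uint8)
    lit = cv2.bitwise_and(frame, frame, mask=mask)

    # ...and accumulate with a per-pixel maximum, which blends old and
    # new light smoothly instead of leaving hard edges.
    exposure = np.maximum(exposure, lit)

    cv2.imshow("long exposure", exposure)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()
```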

Alongside the long exposure photograph work, we had to create the sun. We couldn't compose it from the videos we were using to create the long exposure photographs because of the bandwidth issues mentioned previously. We briefly considered whether we could compose it from just the resulting photographs - but a back-of-the-envelope calculation showed that, for the quantity of acceptable-quality images we were talking about, bandwidth would still be an issue. There were also aesthetic issues with static photographs - the sun isn't static. It's alive, it's dynamic, it breathes, and this is something I thought was a very important part of any potential visualisation.
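To give a flavour of that back-of-the-envelope sum (the numbers here are illustrative assumptions, not the ones we used):

```python
# Rough estimate: what a sun composed of static photographs would cost
# each visitor in bandwidth. Both figures below are assumed.
photos = 5000   # a plausible number of light paintings
size_kb = 300   # an acceptable-quality JPEG
print(f"~{photos * size_kb / 1024:.0f} MB to download the whole sun")  # ~1465 MB
```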

As it transpired, Daniel Massey from Olafur's studio had code up and running in Cinder, another creative C++ library. He'd been tracking light sources for a different project, and his code was already pumping out position, size and time data for multiple light sources. We could use this to efficiently represent an animated version of the light painting.
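The essence of that kind of tracker looks something like this Python/OpenCV sketch (parameters assumed; this is not Daniel's Cinder implementation):

```python
# Threshold each frame, find bright blobs, and emit position, size and
# time for each one - the same shape of data Daniel's code produced.
import time
import cv2

def track_lights(frame, threshold=200, min_radius=2.0):
    """Return (x, y, radius) for each bright blob in a BGR frame."""
    luma = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    _, mask = cv2.threshold(luma, threshold, 255, cv2.THRESH_BINARY)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    lights = []
    for c in contours:
        (x, y), radius = cv2.minEnclosingCircle(c)
        if radius >= min_radius:  # ignore single-pixel noise
            lights.append((x, y, radius))
    return lights

cap = cv2.VideoCapture(0)  # assumes a camera at device 0
t0 = time.time()
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    for x, y, r in track_lights(frame):
        print({"x": x, "y": y, "size": r, "t": round(time.time() - t0, 3)})
cap.release()
```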


Meanwhile, I'd been busy learning enough WebGL to render a sun in the browser. Matt Shotton had created some fake drawing data for me with a squiggle generator he'd written in Python, and I'd managed to get a visual prototype up and running.
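A squiggle generator can be as simple as a smoothed random walk. This Python sketch (the output format is my assumption, not Matt's) emits position, size and time per point, mimicking the tracker's output:

```python
# Fake drawing data: a gently wandering path in normalised coordinates.
import json
import math
import random

def squiggle(n_points=200, step=0.02):
    x, y = 0.5, 0.5
    heading = random.uniform(0, 2 * math.pi)
    points = []
    for i in range(n_points):
        heading += random.uniform(-0.5, 0.5)  # gentle wander
        x = min(max(x + step * math.cos(heading), 0.0), 1.0)
        y = min(max(y + step * math.sin(heading), 0.0), 1.0)
        points.append({"x": x, "y": y,
                       "size": random.uniform(2, 6),
                       "t": i / 30.0})  # assume 30 fps capture
    return points

print(json.dumps(squiggle()[:3], indent=2))
```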

Early Sun Visual Prototype

It was fairly crude and didn't have much life, but it looked sun-like enough for me to pursue. If we joined it up with Daniel's tracking data, we would be able to draw and animate hundreds of light paintings simultaneously. We decided it was the approach most likely to give results in time for the opening at the Tate, which by this time was about a week away.

Matt got busy writing a server in CherryPy (he does like his Python), hooked up to a database to store both the point data (in JSON form) and the long exposure photographs, while I worked on turning Daniel's tracking data into 3D geometry I could use on the sun's surface. I wanted the sun to appear alive, but knew JavaScript was not going to be up to the job of moving hundreds of thousands of vertices around. The geometry needed to be static, and the animation needed to be done on the GPU. Several hours of head scratching and shader writing later, I had written my first simple vertex shader, moving and scaling 3D geometry, and I'd converted the time information to texture coordinates, allowing me to scroll a colour ramp through the geometry, playing back the drawing process captured in the blackout room in the Tate. My line draw was glitchy and my quaternion rotations weren't working, but luckily Richard Taylor was able to step in and sort out my hurried maths.

I realised then we should thin out the tracking data, so Matt wrote something to collapse points that were close to each other and remove points that formed almost straight lines. Fewer polygons, faster performance. A bit more work and the sun was navigable, exploding out to reveal individual drawings and contracting back to a dense sun.
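The thinning itself boils down to two passes, sketched here in Python with NumPy (the tolerances are assumptions, not Matt's values): collapse points that sit nearly on top of each other, then drop points whose neighbours already describe a near-straight line:

```python
import numpy as np

def thin(points, min_dist=2.0, max_deviation=0.5):
    """points: (N, 2) array of tracked positions, in drawing order."""
    points = np.asarray(points, dtype=float)
    # Pass 1: collapse runs of points closer together than min_dist.
    kept = [points[0]]
    for p in points[1:]:
        if np.linalg.norm(p - kept[-1]) >= min_dist:
            kept.append(p)
    if len(kept) < 3:
        return np.asarray(kept)
    # Pass 2: drop a point when it sits within max_deviation of the
    # straight line joining its neighbours (i.e. it adds almost nothing).
    out = [kept[0]]
    for prev, cur, nxt in zip(kept, kept[1:], kept[2:]):
        d = nxt - prev
        v = cur - prev
        n = np.linalg.norm(d)
        # perpendicular distance of cur from the prev->nxt line
        dev = abs(d[0] * v[1] - d[1] * v[0]) / n if n > 0 else 0.0
        if dev >= max_deviation:
            out.append(cur)
    out.append(kept[-1])
    return np.asarray(out)
```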

Operator Console in the Capture Room

While this was going on, Chris Baume and David Lewis were building an interface for the Tate operators, letting them start and stop the capture process, pump tracking data and long exposure images into our database, and produce a unique ID for each participant at the Tate, which they could later use on the website to find both the animated 3D geometry and the long exposure photograph they'd created.
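Those IDs need to be short enough to write down and hard to guess. A sketch of one way to generate them (my assumption, not the actual console code):

```python
# Short, unambiguous, hard-to-guess IDs linking a visitor's capture
# session to their artwork on the website.
import secrets

ALPHABET = "abcdefghjkmnpqrstuvwxyz23456789"  # no easily-confused 0/O or 1/l

def new_artwork_id(length=6):
    return "".join(secrets.choice(ALPHABET) for _ in range(length))

print(new_artwork_id())  # e.g. "k7wq3m"
```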

Meanwhile Alia Sheikh, our project manager, was extremely busy making sure we had an installation to deploy our software to. Brandon was assuring us we had servers. We were in week two now, but it was starting to look like we might have something viable.

Daniel’s tracking work was in Cinder. Chris’s long exposure work was in openFrameworks. Something had to give, and it was Chris - he ported his work to Cinder, meaning we had one app providing both vector and photo data. Chris also moved some of Daniel's computer vision processing onto the graphics card, allowing real-time creation of light graffiti using an HD camera.

By this time, the equipment Alia had ordered was arriving at the Tate, and was frantically being assembled by Mark Viltan, David Lewis and Julien Pansiot. Once set up, we had a light graffiti installation in the Tate and it was giving great results.


It worked.

Alia's installation diagram (one of many!)

I travelled down from Manchester to London the day before the installation was due to open. Alia and I wanted to prove everything worked with an end-to-end test: we needed to see someone's light graffiti painting make it from the capture rig in the blackout room, as a live feed, to the Tate concourse; into the database as vector data and image; and from there back out onto the concourse as part of the sun visualisation (which polls the database to check for new images). We got there sometime on Friday evening, at the cost of the time I'd desperately wanted to spend improving the sun visuals. So Saturday for me was spent cross-legged on the Tate floor behind the installation, pushing live updates to it whilst no-one was looking. I wanted it to look as good as possible for the opening night.

The launch went smoothly - no last minute adjustments required, unlike the last demo I worked on, which I rebooted just as Her Majesty The Queen rounded the corner to see it - a successful fix, but an incident the Controller of BBC R&D has still not quite forgiven me for. 500 participants spent two hours exploring the Tate's surrealist gallery with Little Sun torches, and there were long queues for the blackout room. People bought into the idea, they wanted to paint, and we'd managed to get from zero to installation in two weeks flat.

My desk for Friday and Saturday? On the floor, behind the installation.


Olafur opens the installation

Queueing to make light graffiti

Guests

We started Week Three without a public-facing website. We were kind of done, but we had to shore up and fix bugs before opening to the wider world. Joel Merrick was on board by this point, putting the CherryPy server behind Apache and installing a cache to reduce server load. I was very glad to have someone on board with experience of launching websites - Matt and I were responsible for a soon-to-be-public website, but neither of us had ever launched one. Daniel was hard at work building the web framework around the visualisation.
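For a flavour of the server side, here's a minimal CherryPy sketch (the routes and in-memory storage are assumptions for illustration; the real server sat behind Apache with a cache in front):

```python
import json
import cherrypy

DRAWINGS = {}  # artwork id -> list of tracked points (JSON-friendly dicts)

class LittleSunApi:
    @cherrypy.expose
    @cherrypy.tools.json_out()
    def points(self, artwork_id):
        # Served as JSON; cacheable, since a finished drawing never changes.
        return DRAWINGS.get(artwork_id, [])

    @cherrypy.expose
    def upload(self, artwork_id):
        # The capture rig POSTs the tracked point data as a JSON body.
        body = cherrypy.request.body.read()
        DRAWINGS[artwork_id] = json.loads(body)
        return "ok"

if __name__ == "__main__":
    cherrypy.quickstart(LittleSunApi(), "/")
```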


A sun full of light graffiti

I was making last minute visual tweaks, one of which was mightily simple but hugely effective - pinch a load of colours from some completed light paintings and use those to build the colour ramp for the animation. Suddenly I had a more dramatic, moodier looking sun, and because there was only a blip of white in the colour ramp, it looked far more dynamic and painterly, and matched much better with the long exposure photos. The ball was in Daniel's court now, although there was still a bit of work needed from me, as we had to resolve the unique IDs being handed out to Tate visitors, which would get them back to their individual work of art on the online sun. Daniel eventually launched the site on Saturday 4th August, 18 days after the first lines of code were written.
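Building a ramp like that is simple: sample a handful of colours from finished paintings and interpolate between them into a small texture. A sketch in Python with NumPy (the sample colours are assumed):

```python
import numpy as np

samples = np.array([   # RGB colours pinched from completed paintings
    [10, 12, 40],      # deep blue-black
    [180, 60, 20],     # ember orange
    [255, 250, 235],   # the single blip of white
    [120, 30, 60],     # moody magenta
], dtype=float)

xs = np.linspace(0, 1, len(samples))
u = np.linspace(0, 1, 256)
ramp = np.stack([np.interp(u, xs, samples[:, c]) for c in range(3)], axis=1)
ramp = ramp.astype(np.uint8)  # 256 RGB texels, ready to upload as a texture
```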

I hope you'll check out the site and have fun exploring it. I'm mesmerised by the end results - it's like watching a fire. Some drawings are more notable for their movement than their photograph, and some vice versa. The installation is running until 23rd September 2012, so if you can, go along to the Tate Modern, be a part of this digital artwork, and share the artwork you created using the link given to you by the Tate staff.

I'll finish off with one of Olafur's own light paintings - the artist using our realisation of his work of art to create a new work of art with one of his other works of art...


Olafur paints with light
