My project, Swarm, is an interactive audio-visual installation exploring the concepts of the hive mind and swarm intelligence. It is presented as a visualisation of several hundred entities, projected onto a touch-sensitive projection screen and sonified with generative audio. Visitors to the installation are invited to touch the screen, which affects the behaviour of the swarm visualisation in different ways depending on where, and how much of, the screen is touched.
When there is no interaction with the swarm, entities gently ebb and flow across the screen, flocking together with their neighbours. The accompanying audio is designed to complement the visuals, initially comprising smooth, slowly shifting textures. As soon as the screen is touched the hive is disturbed; swarm entities avoid the area being touched, shifting their colour and speeding away when they encounter the disturbance. Each disturbance is individually sonified, with the characteristics of the sound linked to the size and position of the area touched. The larger a disturbance is, the faster and louder its sounds, and the sound’s position in the stereo panorama is directly controlled by its position across the width of the screen.
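The touch-to-sound mapping can be sketched as a simple function. This is an illustrative Python sketch, not the installation's actual Processing or Puredata code; the function name, parameters and scaling are my own assumptions.

```python
def sonify_disturbance(area, x, screen_width, max_area):
    """Map one touch disturbance to audio parameters.

    area: touched area (e.g. in pixels); x: horizontal centre of the touch.
    Returns (speed, volume, pan); speed >= 1.0, volume and pan in 0..1.
    The exact curves are illustrative, not the installation's tuning.
    """
    size = min(area / max_area, 1.0)   # normalised size of the disturbance
    speed = 1.0 + size                 # larger disturbances sound faster...
    volume = size                      # ...and louder
    pan = x / screen_width             # 0.0 = hard left, 1.0 = hard right
    return speed, volume, pan
```

A touch covering half the maximum area at the centre of an 800-pixel-wide screen, for instance, yields a mid-volume sound panned to the middle of the stereo field.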
If there are any large disturbances, or more than a handful of disturbances to the hive, it starts to become agitated, its entities increasing in velocity and shifting their hue towards red. If the hive continues to be disturbed, it will swarm: all entities turn entirely red and move with increased speed and momentum. The entities also change their attitude toward disturbances, swarming around them instead of exhibiting their normal evasive behaviour. This shift towards aggressive behaviour is reflected in the audio, where the bed of soft, flowing sounds is exchanged for sharp, coarse sounds, signalling that the hive is angry. The entities swarm until they are no longer disturbed, taking several seconds to cool down before returning to their docile state.
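The agitation and cool-down cycle amounts to a single clamped "aggression" value that rises while disturbances are present and decays when they stop. The following minimal Python sketch models that idea; the thresholds and rates are illustrative guesses, not the installation's actual tuning.

```python
class Hive:
    """Toy model of the hive's aggression level (0.0 = docile, 1.0 = angry)."""

    SWARM_THRESHOLD = 0.8       # above this, entities swarm around touches
    RISE_PER_DISTURBANCE = 0.05 # per active disturbance, per frame
    DECAY_PER_FRAME = 0.01      # slow decay gives a several-second cool-down

    def __init__(self):
        self.aggression = 0.0

    def update(self, disturbances):
        # Each active disturbance agitates the hive; none lets it cool down.
        if disturbances:
            self.aggression += self.RISE_PER_DISTURBANCE * len(disturbances)
        else:
            self.aggression -= self.DECAY_PER_FRAME
        self.aggression = max(0.0, min(1.0, self.aggression))

    @property
    def swarming(self):
        return self.aggression >= self.SWARM_THRESHOLD
```

Calling `update` once per frame means a sustained touch pushes the hive into its swarming state within a second or two, while the smaller decay rate makes the return to docility noticeably slower.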
The audio is generated from continuous automatic recordings of the gallery space, ingested into time-stretchers whose playback speed, volume, transposition and panning are controlled by the hive’s disturbances and aggression level. The audio generation system is continually refreshed with new recordings from the space, which often contain audio that the installation itself generated and output previously.
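The time-stretching itself lives in the Puredata patch, which is not reproduced here. One common way to stretch audio without changing its pitch is granular playback, where short overlapping grains advance through the recording more slowly than they are emitted; the Python sketch below computes the grain read positions such a stretcher would use. The function and parameter names are mine, for illustration only.

```python
def stretch_positions(n_grains, grain_hop, stretch):
    """Read positions (in samples) for a simple granular time-stretcher.

    grain_hop: output spacing between successive grains, in samples.
    stretch:   2.0 means the recording plays at half speed (twice as long)
               without transposition, because grains advance through the
               source buffer at 1/stretch of the output rate.
    """
    return [int(i * grain_hop / stretch) for i in range(n_grains)]
```

With `stretch=1.0` the positions advance in lockstep with the output and the recording plays at its original speed; larger values crawl through the buffer, producing the slowly shifting textures described above.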
While I have been faithful to the core concept laid out in my project proposal, several aspects have evolved from my original intentions. The most notable is the visual element of the piece, originally slated to be a cloud of white noise exhibiting hive characteristics only when interacted with. I chose to move beyond this idea after realising that such a visualisation, while in keeping with the inspiration for the project (Nine Inch Nails’ concert visuals), would be not only visually dull but unrepresentative of the core idea behind my entire project: hive intelligence and swarming behaviours. Changing the visualisation to flocking entities communicates these concepts much better at all times, even when no interaction is taking place.
The work most closely related to my completed project that I’m aware of is one of Zachary Booth Simpson’s many interactive swarm-related installations, also named Swarm. In his work, a projected swarm of entities is attracted to subjects’ bodies when they walk in front of the projection screen, effectively hiding behind each subject’s shadow. Simpson describes the behaviours of the swarm as either “schooling like fish or bustling like ants, depending on their mood”. This idea of attaching a “mood” to the swarm is prominent in my work, allowing the swarm to be something more than just a simulation of flocking behaviours. Although Simpson includes an implementation of hive moods, it is certainly not as central to his installation as it is to my own. The omission of an audio counterpart and the relatively simplistic visual representation of the swarm entities make for a simple and effective installation that is strongly related to my work but distinct in its execution. My work focuses on tangible interactivity and a visually pleasing representation of the swarm, coupled with audio feedback, whereas Simpson does not seem concerned with decoupling the organic nature of the swarm from its enabling technology. In its technical application, this piece of Simpson’s is dissimilar from my own, relying on the shadows cast by the subjects to interact with the hive. That said, nine of his installations use infra-red for greatly improved accuracy in computer vision, several of them employing infra-red illumination in a very similar way to my project. Looking at his detailed diagrams and explanations now, I can say it would have been extremely useful to find his website before embarking on my project, despite eventually arriving at the same solution.
Another project that encourages interaction with swarm emulations is Jimmy McGilchrist and Darryn van Someren‘s interactive digital media installation, also congruously named Swarm. Video cameras display the Melbourne public of Federation Square on video screens, with superimposed images of butterflies which fly toward and land upon motionless viewers until they move, whereupon the butterflies take flight and leave the video frame. This method of interaction, where the viewers see an image of themselves in the work, has the benefit of being inexpensive for the artist and intuitive for the audience. It also scales extremely well to larger audiences, as Jeffrey E. Boyd and others discuss in the paper Video Interaction for Swarm Art. McGilchrist and van Someren’s work differs in several ways from my own, being closer to an interactive visual toy than an in-depth exploration of swarm behaviour. Aside from the sparsity of the swarm entities and the much larger audience, the principal difference is that the audience see themselves in the work. They are presented with, and included in, an alternate reality through the medium of the screen, where they can interact with butterflies that exist only with them on the screen. In my piece you look into a space defined by the bounds of the projection screen, containing the swarm. The observer and the piece are separate, and the observer is aware that the swarm exists in front of them, in the same reality as they are. When the art includes the audience visually, as in McGilchrist and van Someren’s piece, the audience must accept that the images they see of themselves interacting with the swarm are an illusion. It is uncommon to see yourself in the third person interacting with something, and I therefore believe that using visuals of the audience in interactive pieces akin to my own detracts from their realism.
Yunsil Heo and Hyunwoo Bang’s Oasis: II explores swarm behaviours without using any electronic interaction. Screens set into the floor and covered in sand show a virtual pond of computer-controlled schooling life, which can only be seen where the sand has been displaced. This piece has some commonality with my own, chiefly the requirement for the work to be interacted with before it reveals itself fully. I am very much in support of this paradigm in the art world; I believe that with interaction a piece can be far more immersive and memorable than without.
I do believe the project, broadly speaking, has been a success, and I am very pleased with its aesthetic outcome and robustness, reassured by the positive feedback from my peers. I can say that the project has not fallen short of any of my intentions. Aesthetically, the project has surpassed my expectations, which is in part testament to the rapid development fostered by the Processing programming language and environment.
Technically, I do feel that elements of the project could be significantly improved. This is not to say that in their current incarnation they are not fit for purpose; the code I have written, while robust and functional, runs slowly, as I am not a proficient enough programmer to know how and where to optimise it. The visuals run at 22-24 frames per second, which is certainly fast enough to look smooth, but by today’s standards it is not fantastic, especially considering the system is running on a quad-core CPU. My other technical grievance is the inaccuracy of the infra-red touch detection system: specifically, the system cannot cope with users standing too close to the screen, as their body disrupts the IR light falling on it. The interaction system certainly works accurately enough for the installation to function as intended, but it would be unfit for any installation requiring more precise control. If I were to begin the project again knowing what I do now, I would seek a different touch interaction method, such as frustrated total internal reflection in transparent panes of glass or Perspex.
One of the key elements in the practical realisation of the project was turning a standard front-surface projection screen into a working touch screen. Needless to say, interactivity is central to the piece, so a lot of time was invested in making the touch input system as accurate and responsive as possible. Another important consideration was the audio system, made entirely in Puredata: it had to be both extremely robust and error-tolerant, as well as flexible, with parameters that could be easily controlled. The audio component, an afterthought in the project’s proposal, evolved into a complex system in its own right and took the most time to complete. The keystone that links the entire project together is the “behaviours” system, which takes interaction data from the computer vision library and processes it into up to six “disturbances”, tracking their size, age, area and perimeter, and making a judgement as to the hive’s “aggression level”, a metric passed to all of the system’s components to control the audio and the visuals simultaneously.
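The flow from vision data to disturbances can be sketched as follows. This is a minimal Python illustration, assuming (hypothetically) that the vision library reports detected blobs as simple tuples; the data shapes, names and the 25% coverage threshold are my own stand-ins, not the installation's actual implementation.

```python
from dataclasses import dataclass

MAX_DISTURBANCES = 6  # the behaviours system tracks up to six at once

@dataclass
class Disturbance:
    x: float
    y: float
    area: float
    perimeter: float
    age: int = 0

def track_disturbances(blobs):
    """Reduce raw vision blobs to the six largest disturbances.

    `blobs` is a list of (x, y, area, perimeter) tuples, standing in for
    whatever the computer vision library actually reports.
    """
    largest = sorted(blobs, key=lambda b: b[2], reverse=True)[:MAX_DISTURBANCES]
    return [Disturbance(x, y, area, perim) for x, y, area, perim in largest]

def aggression_level(disturbances, screen_area):
    """A single 0..1 metric shared by the audio and visual components."""
    touched = sum(d.area for d in disturbances)
    # Illustrative scaling: touching 25% of the screen fully enrages the hive.
    return min(1.0, touched / (0.25 * screen_area))
```

The point of this structure is that one number, the aggression level, is computed once per frame and broadcast to every component, so the audio and visuals always agree on the hive's mood.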
The journey I have taken in planning this project and bringing it to fruition has been greatly enjoyable and informative, and I believe this is reflected in the work itself.
- Original Project Proposal (PDF, 78kB)
- Early Google Sketchup Plan of Installation (PNG, 539kB)
- Video showing touch interaction in debug view (AVI, 6.94MB)
- Flickr Gallery with photos of four prototypes and the finished installation (65 photos)
- Jimmy McGilchrist and Darryn van Someren’s Website
- Zachary Booth Simpson’s Website
- Jeffrey E. Boyd’s Paper on Video Interaction for Swarm Art
- Ars Electronica Brochure 2009 (Including Yunsil Heo and Hyunwoo Bang’s Oasis: II on page 11)