How One of 2016’s Most Talked-About Video Games Brought Generative Music to the Masses
A still from No Man's Sky, courtesy of Hello Games.

‘No Man’s Sky’ sound designer Paul Weir’s rule-based music writes itself—literally.

Noise is the norm of our day-to-day existence. Even just a trip to the grocery store on a weekday afternoon offers a cacophony. There's the relentless squeak of rubber soles on linoleum, the distant whirr of shopping cart wheels, the dazed chatter of couples deciding what's for dinner, and the polyrhythmic beeping of the cash registers that guard any available exit. It's enough to make you panic, or at least set your teeth on edge as you're trying to find that one perfectly ripe avocado. But according to Paul Weir—the sound designer and composer behind a number of installation works and video games, including one of this year's most talked-about games, No Man's Sky—it doesn't have to be that way.

Weir works with a UK-based company called the Sound Agency that crafts "music systems" for commercial spaces, including banks, airports, and even the toy department of the UK department store Harrods. The company's goal: to use sound to foster calm in the midst of usually stressful environments. The Sound Agency's systems rely on a technique called generative music—a form popularized by Brian Eno in which a set of computer programming rules draws random phrases from a library of sounds, so that the piece of music essentially composes itself. The music slowly unfolds over time—it's also been called "emergent music"—and in Weir's hands, it becomes a placid experience, the kind that encourages you to slowly exhale, center yourself, and continue going about your day.
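
Conceptually, the technique can be sketched in a few lines of code. The example below is a toy illustration of rule-based phrase selection, not the Sound Agency's actual software; the layer names, phrase names, and probabilities are invented for illustration.

```python
# Toy sketch of rule-based generative music: random phrases drawn from a
# small library, layered over a timeline according to simple rules.
# (Illustrative only; phrase names, layers, and weights are invented.)
import random

# Hypothetical library of pre-composed phrases, grouped by layer.
LIBRARY = {
    "pad":        ["pad_a", "pad_b", "pad_c"],
    "melody":     ["mel_1", "mel_2", "mel_3", "mel_4"],
    "percussion": ["perc_x", "perc_y"],
}

# Rule: how likely each layer is to start a new phrase on a given bar.
TRIGGER_CHANCE = {"pad": 0.9, "melody": 0.4, "percussion": 0.2}

def compose(bars, seed=None):
    """Return a schedule of (bar, layer, phrase) events; every run differs."""
    rng = random.Random(seed)
    schedule = []
    for bar in range(bars):
        for layer, phrases in LIBRARY.items():
            if rng.random() < TRIGGER_CHANCE[layer]:
                schedule.append((bar, layer, rng.choice(phrases)))
    return schedule

if __name__ == "__main__":
    for event in compose(bars=8):
        print(event)  # one of effectively endless possible "performances"
```

Because the selection is random but bounded by rules, a system like this can run indefinitely without ever repeating exactly, which is the property Weir exploits in both retail spaces and games.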

A self-professed "total geek," Weir has long held a fascination with computers and video games. Around the time he started working in the industry, he began tinkering with a piece of software called Koan that allowed him to craft simple generative pieces. He found the method interesting more for its practical applications than its supposed emotional effects. Since generative systems rely on rules, it was intuitive that he could apply the process to soundtracks for video games, where the music becomes responsive to the action unfolding on screen. So over time, he's made increasingly sophisticated systems for the games he's worked on, including the memorably titled 2003 puzzler Ghostmaster, a game in the wildly popular Thief series, and now, No Man's Sky.

The latter game finds players springing from planet to planet in a universe that consists of 18,446,744,073,709,551,616 (2^64) possible planets. In a similar vein to Weir's own work, the astronomical bodies and creatures we discover there are generated procedurally, according to a set of software rules—and the music provides a tonal balance. Drawing on a sound library composed by the high-drama post-rock band 65daysofstatic, the score feels alternately anxious, lonesome, and reflective of the unrestrained joy of infinite space travel. The feeling of overwhelming vastness is only magnified when you realize that you'll likely never hear the exact same musical cue twice.

Speaking over the phone with THUMP, Weir emphasized generative music's ability to locate solutions to real problems. Video game music can be repetitive, so he created a system that doesn't repeat. Shopping malls can be hellish horrors, so he composes pieces that subconsciously influence you to slow down. In the process, he's arguably granted generative music a degree of mainstream exposure unprecedented in the work of his forebears—even Eno, who seems to mostly approach it as a theoretical exercise. On a September afternoon, Weir hopped on Skype to talk to THUMP about his relationship to generative music as a tool and its impact on his work as a sound designer for spaces both real and virtual.

THUMP: What was your first interaction with generative music as a concept?
Paul Weir: I probably started playing with generative music with this software called Koan, which Brian Eno was a big fan of about 15 years ago. Even now, it's the big brother of generative music. But I was doing some teaching at the time, and it was a useful tool. Alongside that, I started to work for the Sound Agency. Originally, they were a very conventional music production studio, but they quickly became a company focused on sound designing retail spaces. From the beginning, we've been working with banks, shopping malls, and airports to put music into those big spaces. Obviously, if you're in a space where you're going to hear music for a long time, either you need a lot of music, or you're going to need a system to cope with that, so we took a generative approach.

How do your systems work? I know there's generative music that composes itself from the ground up, but you can hear acoustic guitars and things of that nature in your pieces.
There's an enormous amount of research in academia on using things like genetic algorithms or Markov chains to make music. There's a whole bunch of different ways of doing it. I've never taken that approach. I let the composer compose, but give them a few tools, without getting in their way, to allow the music to play back in a generative way. The composer does have to adapt how they work to a certain extent, but I don't want them totally altering their workflow, because then you're not getting the composer, you're getting the technology.

So you're recombining recorded work?
Not quite like that. What I don't do is get people to compose in a completely conventional way and then try to fix it. It's more like making a demo track, but then, when you make the proper final music, you make sure that all your phrases are separated. There was a game I worked on called Ghostmaster in 2003. That was the first time I'd done a simple generative system in a game. We had "buckets" of sounds—bass or melodic or percussion. How often you'd pick a sound out of a group depended on what the game was doing. It wasn't particularly sophisticated, but it was an illustration that this kind of approach could work.
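
Ghostmaster's real implementation isn't public, but the "buckets" idea Weir describes reads roughly like the sketch below, in which the game's current state changes how often each bucket gets drawn from; the bucket names, states, and weights here are invented for illustration.

```python
# Rough illustration of the "buckets" idea: each bucket's draw frequency
# depends on what the game is doing. (Names, states, and weights invented.)
import random

BUCKETS = {
    "bass":       ["bass_drone", "bass_slide"],
    "melodic":    ["theme_a", "theme_b", "theme_c"],
    "percussion": ["hit_soft", "hit_hard"],
}

# Hypothetical rule table: per game state, the chance each bucket contributes.
STATE_RULES = {
    "quiet": {"bass": 0.6, "melodic": 0.3, "percussion": 0.1},
    "panic": {"bass": 0.4, "melodic": 0.5, "percussion": 0.9},
}

def pick_sounds(state, rng):
    """Pick zero or more sounds for this moment, based on the game state."""
    rules = STATE_RULES[state]
    return [rng.choice(sounds)
            for bucket, sounds in BUCKETS.items()
            if rng.random() < rules[bucket]]

if __name__ == "__main__":
    rng = random.Random(7)
    print("quiet:", [pick_sounds("quiet", rng) for _ in range(3)])
    print("panic:", [pick_sounds("panic", rng) for _ in range(3)])
```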

What drew you to generative music as a form of artistic expression?
I didn't like that a lot of game music was really flat. Sometimes it'd be connected to game events, but sometimes it's just ticking away in the background. The music can become repetitious, but with [generative music] you're already breaking that cycle. It's a way of finding systems where the music can remain effective over longer periods of time, and also feel more connected to what's happening in the game.

How did you end up using a generative system in No Man's Sky?
[Since the game is] not a traditional linear game, it seemed obvious, given my background, that we'd write a generative music system. I knew we could do it without too much difficulty. The rules in the game are fairly varied. The intensity of the music changes [according to] a set of rules that's contextual. The challenge with No Man's Sky was having [65daysofstatic] write an album and then make a generative version of that album. I was really keen to [let them] just make an album without our interference so that it could sound like them. But then I worked really closely with them to go back to the recording sessions and start identifying the parts I thought would work.
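
The interview doesn't detail the No Man's Sky implementation, but the shape Weir describes (contextual rules driving musical intensity, with the band's album broken into separable parts) could be sketched along these lines; the context flags, intensity mapping, and part names are purely hypothetical.

```python
# Hypothetical sketch: contextual rules map the player's situation to a
# musical intensity, which then decides which pool of separated album
# parts the system may draw from. (All names and thresholds are invented.)
import random

# Separated phrases, tagged with the intensity tier they belong to.
PARTS = {
    "low":  ["drone_1", "drone_2", "sparse_guitar"],
    "mid":  ["arpeggio_a", "arpeggio_b", "soft_drums"],
    "high": ["wall_of_sound", "driving_drums", "lead_line"],
}

def intensity(context):
    """Contextual rules: combine simple flags into an intensity score."""
    score = 0.2
    if context.get("in_space"):
        score += 0.2
    if context.get("hostiles_nearby"):
        score += 0.5
    if context.get("low_health"):
        score += 0.3
    return min(score, 1.0)

def next_phrase(context, rng):
    """Pick a phrase from the tier that matches the current intensity."""
    score = intensity(context)
    tier = "low" if score < 0.4 else "mid" if score < 0.7 else "high"
    return rng.choice(PARTS[tier])

if __name__ == "__main__":
    rng = random.Random(0)
    print(next_phrase({"in_space": True}, rng))  # calm exploration
    print(next_phrase({"hostiles_nearby": True, "low_health": True}, rng))  # danger
```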

What do you think about the emotional experience of generative music? The Sound Agency's website talks about the salutary effects of placing these systems in public spaces.
There's a commercial incentive to do it. If people feel calmer and happier, they'll spend more time in the location, and they'll come back and spend more money. It's really as crass as that. We did a really early test at Glasgow Airport where we were able to measure the sales over a period of time—with our generative music, no music at all, or just normal commercial music. There's quite a significant bump putting generative music on there. Sometimes it's not so big a bump, but the feedback from people tends to be really positive.

Harrods is a great example. We just did the toy department. Every department effectively has autonomy, so there's different music styles in different rooms sometimes. That's not great! It's confusing! What you get is whatever the manager wants to listen to that day, and not what's right for the environment. We care about what's right for the environment and what people are listening to. Sometimes I'll choose a specific key, because there's already noise in that environment in that key. We try to be sympathetic to the space. Ultimately, it's really simple: just create a nice, calming, welcoming soundscape that reassures people and makes them feel happy.

Whether you're making music for public spaces or for No Man's Sky, you seem to be introducing the idea to a lot of people who wouldn't otherwise seek out generative music. You're operating outside of a high art context.
I feel very strongly [that] I'm not interested in high art. I think using generative music in very commercial applications makes a lot of sense. It's not done as often as it could be. If the game introduces the concept of generative music to a wider audience, that's fantastic. The idea of real-time, rule-based music is a very powerful technique, and if we help move it onward, great—even better if you [keep using] me to do it.

Colin Joyce is THUMP's Managing Editor. You can find him on Twitter.