Having released a music app this summer that sculpts sound based on where the listener is located within the National Mall in Washington, D.C., the app-developing band Bluebrain released a follow-up this month that does the same for Central Park. You can’t hear it unless you’re there — but if you are, you can experience approximately 260 musical segments as you stroll around, each of which was written to be experienced in a particular spot.
To be clear, Central Park (Listen to the Light) is a different animal from Soundtracking, Soundtrckr, and other location-based music apps that let users tag places with songs or music stations. Instead, Bluebrain’s creations are pieces of music conceived and written from the ground up to change as the listener moves through a specific location.
“What you do in that particular form factor, it’s a whole new canvas to paint on,” says AOL co-founder Steve Case, somewhat cryptically, in an accompanying video. “But it’s a new opportunity with a new technology and a new device to create a new kind of experience that can reach a new kind of audience.”
That might be the case, but writing an album to go with a place is much harder than writing an album that sounds the same everywhere.
“The idea wasn’t to create an album as an app — it was to create music for a specific location, and because of the iPhone’s built-in GPS capabilities, it was just the best way to realize that idea,” explains one of the brothers behind the app, Hays and Ryan Holladay (who also perform as The Epochs). “But the process is totally different — I mean, writing, recording, and plugging it into the app and walking the park and going back and doing it again and again until it’s right.”
The closest relative to Bluebrain’s location-based albums is probably the videogame soundtrack, which alters itself as a character moves through a level. Except in this case, there’s no game.
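The underlying mechanic — hundreds of musical segments, each mapped to a spot in the park and triggered by the listener's GPS position — can be sketched in a few lines. This is a minimal illustration, not Bluebrain's actual implementation; the zone names, coordinates, and radii below are hypothetical.

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, in meters."""
    R = 6371000.0  # mean Earth radius
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * R * math.asin(math.sqrt(a))

# Hypothetical zones: (segment name, latitude, longitude, radius in meters).
# In the real app there are roughly 260 of these, authored for specific spots.
ZONES = [
    ("Bethesda Terrace", 40.7740, -73.9708, 120.0),
    ("Sheep Meadow",     40.7719, -73.9761, 150.0),
    ("The Ramble",       40.7780, -73.9690, 180.0),
]

def active_segment(lat, lon):
    """Return the nearest zone containing the listener's position, or None.

    As the listener walks, repeated calls with fresh GPS fixes select
    which musical segment should be playing (or fading in).
    """
    best = None
    for name, zlat, zlon, radius in ZONES:
        d = haversine_m(lat, lon, zlat, zlon)
        if d <= radius and (best is None or d < best[0]):
            best = (d, name)
    return best[1] if best else None
```

In practice the hard part is not the geometry but the authoring loop the Holladays describe: writing a segment, walking the terrain with the app, and adjusting the music and zone boundaries until the transitions feel right.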