For music fans, gesture-based music control means things like skipping songs by tapping a button on the steering wheel and waving a hand.
Gesture controls mean something else for musicians or anyone else trying to make music: the ability to shape, pan, record, trigger, play, scrub, and add effects to music. Some early experiments with this involve webcams (AiRGuitar) or motion sensors (Leap Motion, Microsoft Kinect and more).
Imogen Heap has done some really interesting work in the latter area with “The Gloves,” which you can see her play in this video from last February:
Two Cornell engineering undergrads have built a new music glove, called Aura, that pushes the envelope where gesture-based music is concerned, because it uses a new technology: precision electromagnetic sensors made by Ascension Technology Corp., originally designed for medical and motion-tracking applications and the manipulation of 3-D graphics.
These sensors measure the location and orientation of the player’s fingers up to 150 times per second, even when line of sight is blocked, and output that data in MIDI form. That makes these gloves capable of playing invisible instruments, controlling invisible effects, and triggering invisible samples with unprecedented precision, according to Aura’s creators.
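As a rough sketch of what “outputting that data in MIDI form” could look like (the mapping, ranges, and function names here are hypothetical — Aura’s actual implementation has not been published), a single tracker sample can be quantized into a standard MIDI note-on message:

```python
# Hypothetical sketch: turn one electromagnetic-tracker position sample
# (in centimeters; orientation ignored here) into a raw three-byte MIDI
# note-on message. All scales and ranges are invented for illustration.

def position_to_note_on(x_cm, y_cm, channel=0):
    """Map vertical position to pitch and depth to velocity."""
    # Clamp height (0-60 cm) onto two octaves above middle C (notes 60-84).
    note = 60 + max(0, min(24, int(y_cm / 60.0 * 24)))
    # Clamp depth (0-50 cm) onto MIDI velocity 1-127.
    velocity = max(1, min(127, int(x_cm / 50.0 * 127)))
    status = 0x90 | (channel & 0x0F)  # 0x9n = note-on, channel n
    return bytes([status, note, velocity])

msg = position_to_note_on(x_cm=25.0, y_cm=30.0)
print(msg.hex())  # -> "90483f" (note 72, velocity 63)
```

At 150 samples per second, a stream of such messages is fast enough to feel instantaneous to a player.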
Aura inventor Ray Li and programmer/technology wizard Michael Ndubuisi appear in the video below to explain what’s going on with this thing, and to demonstrate it a bit — but first, here’s some exclusive detail from Ndubuisi on what makes Aura different from Imogen Heap’s gloves (we’d suspected that the difference lay in the electromagnets):
“As far as tech is concerned, you’re right in thinking that the electromagnetic sensors that we’re using for our project represent a major difference between our project (in the state that you saw) and Imogen Heap’s gloves. The electromagnetic tracker offers a very low-latency tracking solution (up to a 150 Hz update rate) that can resolve both the position and orientation of the endpoint sensors very accurately. Because it uses a magnetic field to detect the sensors, it also doesn’t require a line of sight, like the optical tracking of the Kinect. The speed and accuracy of the sensor makes it a bit more versatile, and opens some cool possibilities like playing an instrument with continuous pitch (so you could do something like slide between pitches) among other things.”
Here’s what Ndubuisi told us about what’s next:
“We have actually started working on a successor to the Aura instrument, which is an interface that we are calling “SoundSpace.” For this phase of the project, we’ll be using the same electromagnetic tracking technology, along with some other tracking and feedback devices, such as accelerometers, and vibration motors for tactile feedback. SoundSpace was actually largely inspired by Imogen Heap’s awesome gloves along with other cool projects, like the V Motion Project.
“Though our project is constantly evolving, our current vision involves creating something along the lines of what Imogen demonstrated in that performance, something we envision as a virtual ‘workstation’ in 3D space, where a musician can create and manipulate entire songs on the stage, just using their gloves. The musician would have a range of instruments that they can play using the gloves, and they could record loops from those instruments, or from external sound feeds, like their voice or other instruments even.
“We want the musician to be able to ‘place’ these loops on stage, then literally reach out with their hands and grab them to manipulate them. They could change the volume, apply effects or even shorten loops, either one at a time or in groups, all using gestures with their hands. We imagine our interface will give the musician more control over the sounds they create, however, as we want them to be able to play nearly any type of sound or melody they can think of, and to build and layer them in multiple ways.
“We also want SoundSpace to include a large visual component to provide feedback to the musician and audience, and to further captivate and engage the audience. As part of the setup we will have projection screens to display information, and we’ll be able to actually show these loops that the musician has placed on stage, represented in some aesthetically pleasing way, such as vertical or horizontal bars.
“The musician and the audience will actually be able to see when the musician reaches into these loops and manipulates them; for example, if the musician lowers the volume of a loop, you would be able to see the size of the corresponding bar shrink. In addition to the feedback-oriented graphics, we plan on incorporating visual representations of the instruments the musician is playing, along with some cool effects, using abstract images such as lines, circles, particles, etc.”
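A minimal sketch of the loop-volume gesture described above (entirely hypothetical — SoundSpace’s actual design is not public): a grabbed loop’s volume tracks the hand’s height, and the projected bar shrinks or grows to match.

```python
# Hypothetical sketch: while a loop is "grabbed", its volume follows
# the hand's height, and the on-screen bar scales with the volume.
# Class, names, and ranges are invented purely for illustration.

class Loop:
    def __init__(self, name, volume=1.0):
        self.name = name
        self.volume = volume              # 0.0 (silent) .. 1.0 (full)

    def bar_height_px(self, max_px=300):
        """Visual feedback: bar height proportional to volume."""
        return int(self.volume * max_px)

def update_grabbed_loop(loop, hand_y_cm, y_min=0.0, y_max=60.0):
    """Map hand height (cm), clamped to [y_min, y_max], onto volume."""
    y = max(y_min, min(y_max, hand_y_cm))
    loop.volume = (y - y_min) / (y_max - y_min)

drums = Loop("drums")
update_grabbed_loop(drums, hand_y_cm=15.0)   # lowering the hand...
print(drums.volume, drums.bar_height_px())   # -> 0.25 75
```

The point of coupling the audio parameter and the visual directly to one tracked value is exactly the feedback Ndubuisi describes: lowering a loop’s volume visibly shrinks its bar.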
Hopefully, they will take this show on the road.
Updated: This article originally stated that the Aura gloves measured the fingers’ location and orientation at 150 kHz; it’s been corrected to 150 Hz.