Tuesday, March 19, 2013

Live and in Concert:
New Sound Controller Technologies
Enhance and Create Music

Creating and Controlling Sound
Following its mission of applying technology to a wide range of disciplines, including the performing arts, CU's ATLAS Institute recently presented performances that explored the use of controller technologies to produce sound and music.

On March 2, 2013, computer science Ph.D. student Charles Dietrich collaborated with the Boulder Laptop Orchestra (BLOrk), the ensemble-in-residence of the ATLAS Center for Media, Arts and Performance, in the creation of a piece called “Gestures.” In this piece, three music students moved their hands and fingers in front of 3D cameras that could recognize the three-dimensional positions of the center of each hand and of the fingers.

Three students control sound through the movement of their hands, which are tracked by 3D Intel cameras. Images projected on the screen behind them show how the camera's software diagrams the movement of their hands by tracking the centers of their palms and the positions of their fingers.

A Gesture-Based Music Controller
"Our hands are so expressive," Dietrich said. "A technology that can track and record their subtle movements can also help us develop new applications – in music, sculpture, visual arts and CAD programs. Those who study the body language of various cultures may have new tools for research as well."

The hands of doctoral student Charles Dietrich are read by an Intel 3D camera mounted to the top edge of his laptop. Inset image shows how his hands are interpreted by the software. Brightly colored lines are superimposed over his palm and fingers and change angle and shape as he moves. In this way, the software can detect whether hands are open or closed or shaped in a way that might trigger a change in the sound specified by the programmer/developer/musician.

The camera unit incorporated both a depth camera and a standard color camera. The depth camera used an infrared light source and an infrared detector to determine depth from the time-of-flight (ToF) of the light, much like radar but with light in place of radio waves. Data from both cameras were processed by Intel software to determine the hand landmarks.
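
(For technically minded readers, the time-of-flight idea can be reduced to a few lines. The Python sketch below is my own illustration of the principle, not Intel's code: the camera times an infrared pulse's round trip and converts that time to a distance.)

# Minimal illustration of the time-of-flight (ToF) principle described above.
# This is not Intel's code; it simply shows how a round-trip time becomes a depth value.

SPEED_OF_LIGHT_M_PER_S = 299_792_458  # metres per second

def tof_depth_m(round_trip_time_s: float) -> float:
    """Depth is half the distance light travels during the round trip."""
    return SPEED_OF_LIGHT_M_PER_S * round_trip_time_s / 2

# Example: a pulse that returns after about 6.7 nanoseconds came from roughly 1 metre away.
print(f"{tof_depth_m(6.67e-9):.2f} m")  # prints approximately 1.00 m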

Dietrich used data from the Intel-developed camera software to assign sounds to hand movements. For example, the height of the left hand determined pitch, opening or closing the right hand determined whether a note was played, and the height of the right hand controlled loudness. In this way, Dietrich’s gesture-based controller functioned as a musical instrument.
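
(To make that mapping concrete, here is a hypothetical Python sketch of the kind of gesture-to-note logic described above. The specific ranges, scale and function names are my own assumptions for illustration, not Dietrich's actual software.)

# Hypothetical sketch of the gesture-to-sound mapping described above.
# The ranges, scale and function names are illustrative assumptions,
# not Dietrich's actual implementation.

def map_gesture_to_note(left_hand_height: float,
                        right_hand_open: bool,
                        right_hand_height: float):
    """
    left_hand_height, right_hand_height: normalized 0.0 (low) to 1.0 (high).
    right_hand_open: True if the hand-tracking software reports an open hand.

    Returns (midi_pitch, velocity) when a note should sound, else None.
    """
    if not right_hand_open:          # closed right hand: no note is played
        return None

    # Left-hand height chooses the pitch: map 0..1 onto two octaves above middle C.
    midi_pitch = int(60 + left_hand_height * 24)

    # Right-hand height controls loudness (MIDI velocity 0..127).
    velocity = int(right_hand_height * 127)

    return midi_pitch, velocity

# Example frame of tracking data: left hand halfway up, right hand open and raised high.
print(map_gesture_to_note(0.5, True, 0.9))   # -> (72, 114)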

13 Pedals, No Hands
Another controller used at the same concert was developed and played by College of Music Ph.D. student Hunter Ewen. Dubbed the “SCREAMboard” (an acronym of “soloist-controlled, real-time, electronic audio-manipulation board”), the foot-operated musical interface was used for real-time composition and improvisation.

Doctoral student Hunter Ewen presents his foot-operated controller, which allows hands-free control of 13 sound parameters. “My goal is to have the controller allow electro-acoustic musicians a hands-free way to control all the special effects and sound features they could want. Achieving this with earlier tools has been a difficult, awkward and often expensive process,” Ewen said.
Thirteen pedals controlled recording, playing, looping (after being recorded, a sound or sound sequence is played back repeatedly as a “loop”), synchronizing loop lengths (managing the way that multiple loops fit together or align with one another), panning (controlling how sound moves around the room through its arrangement of speakers) and volume.
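
(For readers unfamiliar with looping, the tiny Python sketch below illustrates the basic record-then-repeat idea. It is purely illustrative and is not the SCREAMboard's software.)

# Hypothetical sketch of a one-pedal looper, only to illustrate the "looping"
# idea described above; this is not the SCREAMboard's actual software.

class Looper:
    def __init__(self):
        self.buffer = []        # recorded audio samples (or MIDI events)
        self.recording = False

    def pedal_down(self):
        """First press starts recording; the next press closes the loop."""
        self.recording = not self.recording

    def feed(self, sample):
        """Called for each incoming sample while the performer plays."""
        if self.recording:
            self.buffer.append(sample)

    def playback(self, n_repeats=2):
        """Once closed, the loop plays back repeatedly underneath the soloist."""
        return self.buffer * n_repeats

looper = Looper()
looper.pedal_down()                 # start recording
for s in [0.1, 0.3, -0.2]:          # pretend these are audio samples
    looper.feed(s)
looper.pedal_down()                 # close the loop
print(looper.playback())            # [0.1, 0.3, -0.2, 0.1, 0.3, -0.2]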

In one piece, Ewen (shown at left and below at right) used his voice, a kazoo and several unconventional instruments to record and loop a variety of sounds that he layered into complex textures.
The colorful diagram of vertical bars projected on the screen behind the performers shows the status of the 13 different controller pedals.

In another piece, John Gunther (shown at left), associate professor of jazz studies and co-director of BLOrk, performed a solo piece using Ewen’s interface. He played his jazz flute, recording and looping melodic sequences that harmonized with one another to become a rich background upon which he improvised a solo.

From One Mouse to Many
John Drumheller, a CU College of Music instructor, director of music technology and BLOrk co-director, spoke about how controllers have evolved. “In the past we were limited to one slider or one controller when we used a computer and a mouse. We could control only one aspect of sound with that mouse. But now, with the introduction of the tablet, smart phones and touch screens, each finger can control a completely different sound or quality of sound.”

BLOrk co-director John Drumheller performs on a tablet during a December 2012 concert in the ATLAS Black Box theater.

It is now possible for our 10 fingers to control 10 sounds or aspects of sound (pitch, volume, rhythms, harmonies, etc.) simultaneously. Of course, the piano has for centuries allowed our 10 fingers to make 10 different sounds at the same time, but here we are talking about the ability to assign a different sound, instrument, rhythm or dynamic to each finger. (This is similar to the way Hunter Ewen’s foot-pedal device lets a performer assign and control 13 different sounds or parameters.)
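
(Here is a hypothetical Python sketch of that multi-touch idea: each touch point the screen reports drives its own voice, with the finger's position setting that voice's pitch and volume. The touch-event format and parameter choices are my assumptions, not the software used in the concert.)

# Hypothetical sketch of mapping multi-touch points to independent voices.
# The touch-event format and voice parameters are illustrative assumptions;
# this is not the actual tablet software used in the performance.

def touches_to_voices(touches):
    """
    touches: list of dicts like {"id": 0, "x": 0.25, "y": 0.8}
             with x and y normalized to 0.0 .. 1.0 on the screen.

    Each touch id drives its own voice: x chooses pitch, y chooses volume.
    Returns a dict of voice settings keyed by touch id.
    """
    voices = {}
    for t in touches:
        voices[t["id"]] = {
            "midi_pitch": int(48 + t["x"] * 36),   # three octaves across the screen
            "volume": round(t["y"], 2),            # higher on the screen = louder
        }
    return voices

# Two fingers on the screen -> two independent voices.
print(touches_to_voices([
    {"id": 0, "x": 0.10, "y": 0.9},
    {"id": 1, "x": 0.75, "y": 0.4},
]))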

Four Controllers, One Touch Screen
In the music lab, Drumheller briefly showed me his tablet. He set its screen up with four rectangles of different colors. I touched the screen, letting my fingers slide around the surface, tap and wiggle. Instantly (much to my delight!), four different exuberant sounds were generated simultaneously as my fingers moved.

Soon, as Dietrich’s gesture-based technology progresses, we will be able to move hands and fingers in the air around us, unconfined by screen, tablet or keyboard. Our movements will instantly become sound. Dance and music, always complementary art forms, may become one.

~ ~ ~ ~
See additional photos of the March 2nd concert of the Boulder Laptop Orchestra (BLOrk) that took place in the downstairs ATLAS Black Box theater.

Get yourself on the ATLAS email list. ATLAS concerts, dance/theater performances, art installations and Speaker Series talks are always free and open to the public (well, with very few, rare exceptions).

Look into a list of upcoming ATLAS events.
Perhaps there are several you'd like to attend.


Contact the writer: Ira Liss is ATLAS Institute's assistant director of communications and also a pianist, singer/songwriter and performing artist. Enjoy videos of his original work.