By Ashley Yeager

Ken Stewart uses his motions and an Xbox Kinect to narrate, musically, a dance by Thomas DeFrantz. Credit: Duke University Dance Program.

To watch Ken Stewart dance in front of his Xbox Kinect gives a whole new meaning to the “Dance Your Ph.D.” contest.

Stewart, a graduate student in the music department and a composer, is using the camera, along with specialized computer software, to narrate dance with sound. He demoed the program while walking an audience through imnewhere, or I'm new here, his composition about dance professor Tommy DeFrantz's journey to Duke.

The Jan. 27 presentation was part of the Visualization Friday Forum and gave attendees a behind-the-scenes look at the research and mathematics behind Stewart’s new, “more expressive way” to write music.

With the Kinect, which has motion-detection technology for interacting with video games, Stewart can transform his gestures into sound, intimately controlling the loudness, pitch and rhythmic intensity of the score he creates. The system tracks 15 points on the performer's body, including his head, neck, shoulders, knees and feet.

Drawing on a library of sounds, the performer can then choreograph a composition: the computer calculates the angles between his hands and the distance between his body and the camera, and converts those measurements into musical notes.
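Stewart's actual software isn't described in detail, but the mapping he outlines, from a joint angle to a pitch and from camera distance to loudness, can be sketched in a few lines. Everything below (function names, MIDI-style pitch and velocity ranges, the 0.8–4.0 m distance window) is an illustrative assumption, not his implementation:

```python
import math

def angle_between(a, b, c):
    # Angle in degrees at vertex b, formed by 3-D points a-b-c
    # (e.g. left hand - torso - right hand from the Kinect skeleton).
    v1 = [a[i] - b[i] for i in range(3)]
    v2 = [c[i] - b[i] for i in range(3)]
    dot = sum(x * y for x, y in zip(v1, v2))
    n1 = math.sqrt(sum(x * x for x in v1))
    n2 = math.sqrt(sum(x * x for x in v2))
    return math.degrees(math.acos(dot / (n1 * n2)))

def angle_to_pitch(angle, low=48, high=84):
    # Map a 0-180 degree angle linearly onto a MIDI pitch range
    # (C3-C6 here, an arbitrary choice for the sketch).
    return round(low + (angle / 180.0) * (high - low))

def distance_to_velocity(meters, near=0.8, far=4.0):
    # Closer to the camera -> louder: clamp the distance to the
    # sensor's usable range, then invert it onto MIDI velocity 0-127.
    meters = min(max(meters, near), far)
    return round(127 * (far - meters) / (far - near))
```

A gesture with the arms at a right angle, performed a metre from the sensor, would then produce one note: `angle_to_pitch(angle_between(left_hand, torso, right_hand))` for the pitch and `distance_to_velocity(1.0)` for its loudness.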

The work, Stewart says, gives him a way to use his ears and actions to "feel out" a song. He concedes there are hiccups between how he moves and the sounds created, but he thinks the imprecision adds to the expressivity of the composing process.

Stewart said he and DeFrantz are still working on imnewhere. They plan to expand the piece to 15 minutes and will perform it again in Grand Rapids, Mich., Berkeley, Calif. and Belfast, UK.