3D Sound Space - NSF REU: Interdisciplinary Research...


Source: reu.cct.lsu.edu/documents/2011-posters/Carney poster final.pdf

Gesture sensing video game technologies such as the Nintendo Wii and Xbox Kinect have provided a new area of experimentation for computer music composers and gamers alike. In some live computer music performances, ensemble members use Wii Remotes to create sounds in predetermined ways. Likewise, in the gaming community, video games such as Wii Music allow users to use their Wii Remotes to create sounds with on-screen instruments. Yet users have not been able to create music by designing their own musical environments in 3D space.

The Microsoft Kinect generates a 3D depth map using infrared structured light. This data is passed into the OpenNI framework, where skeletal tracking algorithms generate positional information for up to 15 joints. Simple OpenNI relays the information to Processing, where visual rendering, audio parameterization, and state machine maintenance are performed. From Processing, OSC messages are sent to SuperCollider, which provides real-time audio synthesis based on synthesizer definitions.
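The OSC traffic between Processing and SuperCollider is not shown on the poster. As a minimal sketch of the wire format, the Python snippet below hand-encodes a single OSC message carrying three float joint coordinates; the `/joint/right_hand` address and the coordinate values are hypothetical, and the real project would use Processing's and SuperCollider's own OSC libraries rather than manual encoding.

```python
import struct

def osc_pad(b: bytes) -> bytes:
    """Null-terminate and pad to a 4-byte boundary, per the OSC 1.0 spec."""
    b += b"\x00"
    while len(b) % 4 != 0:
        b += b"\x00"
    return b

def osc_message(address: str, *floats: float) -> bytes:
    """Encode an OSC message whose arguments are all 32-bit floats."""
    msg = osc_pad(address.encode("ascii"))                      # address pattern
    msg += osc_pad(("," + "f" * len(floats)).encode("ascii"))   # type tag string
    for f in floats:
        msg += struct.pack(">f", f)                             # big-endian float32
    return msg

# Hypothetical joint-position message: one such packet per tracked joint per frame.
packet = osc_message("/joint/right_hand", 0.25, -0.5, 1.75)
```

The packet can then be sent over UDP to SuperCollider's language port; only the encoding is sketched here.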

Open Source Software:

OpenNI, Processing, SuperCollider, Simple OpenNI

This material is based upon work supported by the National Science Foundation under Grant OCI-1005165

Microsoft Kinect: An Xbox gaming platform device which uses infrared light to generate a 3D depth map (alongside RGB camera and microphone data)

OpenNI: Provides skeletal tracking middleware and Kinect drivers

Simple OpenNI: Uses OSC to relay joint position data to Processing

*Processing: A Java-based programming platform focused on computer graphics. Project use includes visual rendering, audio parameterization, and state machine maintenance

*SuperCollider: A programming environment and language which defines and runs audio synthesizers

OpenSoundControl (OSC): A standard communication protocol used in digital media

A user will be able to create meta-instruments in 3D space which produce sounds on their own and can also be triggered through the user's interaction. The user will be able to manipulate meta-instruments to explore a wide variety of timbres. Users may then create music by playing or performing in their 3D Sound Space. The final deliverable will maximize utility for the student, musician, and composer.

The Microsoft Kinect produces a 640x480 depth map which is rendered by Processing. A user may calibrate the OpenNI skeletal tracking system and then watch their skeleton manipulate 3D meta-instruments in a digitally rendered 3D space.

Sphere objects perform FM synthesis based on their location. WAV objects play audio files, with rate and pitch modulation values parameterized by location. In Run mode, a sphere object is also activated by direct contact with the hands.
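The sphere parameterization can be sketched as simple two-operator FM. The poster states only that z controls the modulation index; mapping x and y to the carrier and modulating frequency multipliers, and all constants below, are assumptions for illustration. In the actual system this synthesis runs in SuperCollider, not Python.

```python
import math

def fm_sample(t, base_freq, carrier_mult, mod_mult, mod_index):
    """One sample of two-operator FM: a carrier whose phase is driven by a modulator."""
    mod = mod_index * math.sin(2 * math.pi * base_freq * mod_mult * t)
    return math.sin(2 * math.pi * base_freq * carrier_mult * t + mod)

def sphere_voice(x, y, z, base_freq=220.0, sr=44100, n=1024):
    """Map a sphere's position to FM parameters and render n samples.
    Only the z -> modulation index mapping comes from the poster; the
    x/y mappings and scaling factors are hypothetical."""
    return [fm_sample(i / sr, base_freq,
                      carrier_mult=1 + x,
                      mod_mult=1 + y,
                      mod_index=z * 5)
            for i in range(n)]

samples = sphere_voice(0.5, 0.25, 0.8)
```

Moving the sphere then sweeps the spectrum continuously: larger z deepens the modulation (more sidebands), while x and y shift the carrier-to-modulator frequency ratio and hence the harmonic character.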

Sphere and WAV objects produce sounds while moving through space as well as through interaction with the user's hands while stationary.

Student: Principles of FM synthesis as well as pitch and rate modulation can be studied
Musician: Wide timbral variance is achieved with FM synthesis, along with the ability to play and filter WAV recordings
Composer: Classical audio synthesis techniques are performed in newly interactive ways

Background

3D Sound Space

An Interactive and Customizable Meta-Instrument using the Microsoft Kinect
Michael Carney, Stephen David Beck, Jesse Allison, Christian Dell, Yemin Oh, Dr. Randall Hall

System Design

Components

3D

Interactive

Customizable

Multi-user

Results

Meta-Instruments and Parameterization

System Operation

For 3D Sound Space:
- Use of extended audio filtering techniques
- Use of parallel and series FM synthesis
- Use of 3D digital display technologies

In all fields:
- Multi-disciplinary educational potential
  - ex) Viewing the solar system
  - ex) Interacting with molecules to create chemical bonds

- Use of the Kinect RGB camera as well as the multi-array microphone

Initialize skeletal tracking and the menu as instructed. Menu navigation is done with the right hand, while the left hand selects by 'clicking' forward (towards the Kinect sensor). Design mode allows sphere creation and placement through left-hand 'click' gestures. Each object begins attached to the right hand and is placed upon 'click'. Run mode allows spheres to be 'played' by touching them with the hands. The "Selection" and "Options" buttons are accessed by raising both arms towards the button. Selection mode allows users to re-place spheres during performance.
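The mode flow above (the "state machine maintenance" Processing performs) can be viewed as a small state machine. The sketch below is a hypothetical Python rendering of those transitions; the gesture event names are invented for illustration, and the real machine lives inside the Processing sketch.

```python
from enum import Enum, auto

class Mode(Enum):
    MENU = auto()       # navigate with right hand, 'click' with left
    DESIGN = auto()     # create and place spheres
    RUN = auto()        # 'play' spheres by touching them
    SELECTION = auto()  # re-place spheres during performance

# Hypothetical gesture events driving the transitions described above.
TRANSITIONS = {
    (Mode.MENU, "select_design"): Mode.DESIGN,
    (Mode.MENU, "select_run"): Mode.RUN,
    (Mode.DESIGN, "select_run"): Mode.RUN,
    (Mode.RUN, "raise_both_arms_selection"): Mode.SELECTION,
    (Mode.SELECTION, "sphere_replaced"): Mode.RUN,
}

def step(mode, gesture):
    """Advance the mode machine; unrecognized gestures leave the mode unchanged."""
    return TRANSITIONS.get((mode, gesture), mode)

mode = Mode.MENU
for g in ["select_design", "select_run", "raise_both_arms_selection", "sphere_replaced"]:
    mode = step(mode, g)
# mode is now Mode.RUN
```

Keeping the transitions in a lookup table makes it easy to add buttons like "Options" later without rewriting the per-frame gesture handling.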

[Figure: related gesture-tracking work - Wii Remote, Microsoft Kinect (skeletal/motion tracking), GRIP Maestro, Pong]

Objective

Sphere: carrier freq. multiplier, modulating freq. multiplier, z: modulation index
WAV: pitch modulation, rate modulation

Acknowledgements

Future Work