Augmented Sound Ensemble for Instruments and Objects

Project supervisor: Christos-George Michalakos

Abstract

Sensor technology allows us to electronically enhance the sound of instruments and inanimate objects. By capturing data from gestures along with the sound of the physical performance, we can augment the palette of our instrument in real time using digital signal processing techniques. For example, an accelerometer attached to a guitarist’s wrist can provide information about the hand’s movement, which can then control a filter parameter.
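As a sketch of the kind of mapping described above (written in Python rather than Max/MSP for illustration; the sensor range and cutoff bounds are hypothetical choices, not part of the brief):

```python
def accel_to_cutoff(accel_g, lo_hz=200.0, hi_hz=8000.0, max_g=2.0):
    """Map an accelerometer magnitude (in g) to a filter cutoff (Hz).

    Uses an exponential curve so equal changes in gesture intensity
    feel like equal changes in register; all ranges are illustrative.
    """
    # Clamp the reading to the assumed sensor range, then normalise to 0..1.
    x = max(0.0, min(accel_g, max_g)) / max_g
    # Exponential interpolation between lo_hz and hi_hz.
    return lo_hz * (hi_hz / lo_hz) ** x

print(accel_to_cutoff(0.0))  # resting hand -> 200.0 Hz
print(accel_to_cutoff(2.0))  # vigorous gesture -> 8000.0 Hz
```

The same curve could equally drive any other effect parameter; the point is that the mapping function, not the sensor, defines how the gesture feels to play.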

In this project, you are invited to create individual or collaborative electronically augmented instruments, using enhanced traditional instruments or repurposed found objects. As a group, you will explore mapping and networking strategies between individual responsive systems, developing architectures towards composition, performance and improvisation as an augmented ensemble.

Basic electronics, sensor systems and the visual programming environment Max/MSP with Jitter are some of the tools that can be used for the realization of the project.

Aims and Objectives

  • Develop a basic augmented instrument or sound-making object. Note that the focus of this brief is on performing with instruments that are directly coupled with the laptop rather than using the laptop as an instrument in itself.
  • Combine the individual augmented instruments in order to develop a collaborative compositional and performance strategy for the ensemble.
  • Create a final performance.


Learning outcomes

  • Become familiar with digital signal processing techniques and with sensors and interfaces such as piezoelectric microphones, accelerometers, cameras and Arduino boards.
  • Experiment with different mappings.
  • Learn more about sound and data exchange over a network.
  • Use Max/MSP and Jitter to build an interactive performance system.
  • Collaborate within a group of performers, although not everyone in the group will necessarily perform.
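The data exchange over a network mentioned above can be as simple as sending control values between laptops as UDP datagrams, which is also how OSC messages travel. A minimal stdlib sketch (the port number and message format here are arbitrary choices for illustration):

```python
import socket

PORT = 9001  # arbitrary port chosen for the ensemble's control data

# Receiver: one performer's laptop listening for control values.
rx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
rx.bind(("127.0.0.1", PORT))
rx.settimeout(2.0)  # avoid blocking forever if nothing arrives

# Sender: another performer broadcasting a sensor reading.
tx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
tx.sendto(b"/wrist/accel 0.42", ("127.0.0.1", PORT))

data, addr = rx.recvfrom(1024)
print(data.decode())  # -> /wrist/accel 0.42
tx.close()
rx.close()
```

In practice, Max/MSP's networking objects handle this layer, but seeing it at the socket level makes clear that an ensemble's mapping network is just small text messages flowing between machines.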

Presentation suggestions for both submissions

A short performance, well documented on either a webpage or a DVD containing audio, video, text and code.

Related Projects / Bibliography
Hyper-Kalimba
The K-Bow Data Screen

Metasaxophone
https://ccrma.stanford.edu/~mburtner/metasax.html

Digi Didgeridoo
http://createdigitalmusic.com/2009/12/03/digi-didgeridoo-augmented-wireless-digital-instrument-with-aboriginal-roots/

Hybrid Percussion
http://web.media.mit.edu/~roberto/hybrid-percussion/

Augmented Violin
http://imtr.ircam.fr/index.php/Augmented_Violin/

MIT Hyperinstruments
http://opera.media.mit.edu/projects.html

Atau Tanaka
http://www.youtube.com/watch?v=FB_yE_Y3_8k

Augmented voices performance
http://www.youtube.com/watch?v=UDtm-YcX4uI

Found Object
http://en.wikipedia.org/wiki/Found_object

Princeton Laptop Orchestra
http://plork.cs.princeton.edu/

Cléo Palacio-Quintin, “The Hyper-Flute”, in Proceedings of NIME’03, Montreal, Canada.

Carnegie Mellon Laptop Network Orchestra (pdf link)

A. K. E. Yang and A. T. P. Driessen, “Wearable Sensors for Real-Time Musical Signal Processing”, in IEEE Pacific Rim Conference on Communications, Computers and Signal Processing (PACRIM), Aug. 2005.

Hunt, A. et al., “The Importance of Parameter Mapping in Electronic Instrument Design”, Journal of New Music Research, Vol. 32, No. 4, 2003, pp. 429–440.

Verfaille, V.; Wanderley, M. M.; Depalle, P., “Mapping Strategies for Gestural and Adaptive Control of Digital Audio Effects”, Journal of New Music Research, Vol. 35, No. 1, 2006, pp. 71–93.

Arduino
http://www.arduino.cc/

Max/MSP, Jitter

http://cycling74.com/




