In New Digital Musical Instruments, Eduardo Miranda and Marcelo Wanderley focus on musical instruments that use the computer as the main sound-generating device. Such instruments often consist of a gestural controller driving the musical parameters of a sound synthesizer in real time.
With very few exceptions, electronic musical synthesizers have, since the very early models, been designed to be played with a piano-like keyboard. Astonishingly, the appearance of the digital synthesizer and the definition of the MIDI protocol did not change this trend. It is not by chance that, in pop-music parlance, performers who play synthesizers are often referred to as keyboard players.
With the emergence of software-based synthesizers and the availability of increasingly faster and more affordable personal computers, musicians started to implement their own digital musical instruments and create demands for controllers other than the keyboard itself.
Over the last decade, a community that remained largely marginalized by academia and industry for a number of years has been the source of many interesting developments in this area. Apart from a few papers scattered across various journals and the proceedings of the relatively new NIME (New Interfaces for Musical Expression) conference, there is no concise literature on this fascinating new topic. This book is an attempt to document these developments and to inform researchers and musicians interested in designing new digital musical instruments with control and interaction beyond the conventional keyboard paradigm.
The book is divided into five chapters. The first chapter discusses the notion of musical gestures, their acquisition, and their mapping onto the variables of a synthesizer. The second chapter focuses on practical issues of gestural controller design and reviews various examples of gestural controllers. The third chapter follows with an introduction to sensors and sensor-to-computer interfaces. It reviews the application of these sensors in the design of various digital musical instruments and discusses methods for converting the sensor signals into data that can be used to control software sound synthesizers. The fourth chapter introduces the use of electrical signals produced in the body, such as nerve, muscle, and brain signals, to control music. It presents different types of biosignals and introduces basic techniques for rendering these signals useful for musical control. Finally, the fifth chapter discusses an interesting route to new instrument design, which involves providing such instruments with artificial intelligence in order to render them interactive.
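To give a flavor of the sensor-to-synthesizer conversion discussed above, here is a minimal sketch of the kind of mapping such instruments typically perform: scaling a raw sensor reading into a MIDI-style control range. The function name, sensor resolution, and ranges are illustrative assumptions, not material from the book.

```python
def scale_sensor(value, in_min, in_max, out_min=0, out_max=127):
    """Linearly map a raw sensor reading to a MIDI-style control range (0-127)."""
    value = max(in_min, min(in_max, value))  # clamp to the sensor's valid range
    span = in_max - in_min
    return round(out_min + (value - in_min) * (out_max - out_min) / span)

# e.g. a hypothetical 10-bit pressure sensor (0..1023) driving a synth parameter
print(scale_sensor(512, 0, 1023))  # mid-range pressure -> control value 64
```

Real instruments layer further processing on top of such a mapping (smoothing, calibration, one-to-many mappings), which is exactly the design space the book explores.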
Eduardo Reck Miranda is a full professor of computer music at the University of Plymouth, where he heads the Interdisciplinary Centre for Computer Music Research (ICCMR) and is director of the Master in Interactive Intelligent Systems course. He has published extensively on the topic of computer music, with important contributions on algorithmic composition, sound synthesis, and artificial intelligence.
Marcelo M. Wanderley is currently Associate Professor in Music Technology at McGill University. His research interests include the gestural control of sound synthesis, input device design and interaction, sensor design and data acquisition, and human-computer interaction; they are reflected in his many publications and presentations at international conferences.