Music Technology
Music technology encompasses the use of electronic instruments, computer software, and sound manipulation tools to enhance music composition, performance, and recording. The field has significantly broadened personal expression in music, allowing artists to create innovative sounds through hybrids of traditional instruments and electronic devices. Key features include composition software that offers automatic notation and playback, as well as advanced editing and mixing capabilities. Music technology has its roots in historical developments, from early acoustic instruments to groundbreaking electronic innovations such as the theremin and the synthesizer.
As technology advanced, tools such as digital audio workstations (DAWs) became widely accessible, enabling both professional and amateur musicians to produce and manipulate music more effectively. Music technology also intersects with fields like music cognition and interactive systems, exploring how technology can deepen the connection between artists and audiences. Today, the rise of artificial intelligence (AI) in music creation is transforming how music is composed and produced, offering new possibilities while raising questions about authenticity and artistry. Overall, music technology continues to evolve, influencing various genres and making music creation more inclusive and accessible.
Summary
Music technology is the application of computer software, electronic instruments, and other sound-manipulation equipment in the composition, performance, and recording of music. The boundaries of personal expression through music are expanding with computer-instrument hybrids, such as those based on the piano and the guitar. Software facilitates the composition of scores with automatic notation and immediate playback features. Technology also allows the precise reproduction and efficient storage of music as well as advanced editing, mixing, and mastering capabilities. It even makes music more accessible through applications such as ringtones for cellular phones and electronic music effects for video games.
Definition and Basic Principles
Music technology integrates personal expression through instrumental and vocal music with cutting-edge applied science, enhancing rather than replacing traditional music theory, composition techniques, and philosophy. It also facilitates the invention of novel instrumental and choral sounds and styles of music. Music technology often incorporates mathematical underpinnings to create synergy in composition and performance between electronic instruments (such as synthesizers, sequencers, and samplers) and classic instruments. Musicians now have access to technical equipment that enables them to make stronger harmonic decisions and more fully manifest their artistic visions.
[Image: Modular synthesizer. Photo by Maschinenraum from Flickr, CC BY-SA 2.0, via Wikimedia Commons.]
Music technology is often confused with sound technology. Sound technology involves the reproduction of sounds with adjustments for volume and clarity and is applied in the development of hearing aids and acoustic halls. Music technology may also be differentiated from the closely related field of audio engineering, or recording engineering, which is concerned with capturing, editing, and storing music or other audio files. Music technology takes into account aesthetic composition and innovative performance, as well as reproduction with fidelity.
Background and History
In the broadest sense, any technological development used in music can be seen as music technology. The development of instruments, from prehistoric drums to the classical forms of the violin and piano, has traditionally been one of the driving forces in musical innovation. Beginning in the late nineteenth and early twentieth centuries, however, with most acoustic instruments well established in standard designs, the leading edge of technology and music began to focus mainly on electronics. A major breakthrough was the invention of the triode vacuum tube in 1906, which allowed the development of new electronic musical instruments.
In 1920, Russian engineer Léon Theremin invented the instrument that bears his name, which was patented in 1928 and went on to become iconic as a wholly electronic instrument. The theremin consists of two loop antennae that control oscillators; one regulates amplitude (volume) and the other regulates frequency (pitch). The antennae sense movements of the performer's hands, without any physical contact, to create changes in the electronic signal, and the resulting eerie music is amplified and transmitted through speakers.
The 1930s saw further developments, including the invention of recording with magnetic tape, which would enable many new techniques such as sampling and sound effects, and the digital coding method called pulse code modulation (PCM). Meanwhile, musicians began to experiment with electronic means of amplifying acoustic instruments in order to achieve greater volume in band settings. The earliest experiments with electric guitars started in the early 1930s, and the first recording featuring a magnetic pickup was made in 1938. Electric guitars would go on to heavily influence popular music.
The instrument most closely associated with music technology is the synthesizer; the first successful programmable example, the RCA Mark II, was completed in 1959 at the Columbia-Princeton Electronic Music Center. Engineering physicist Robert Moog revived interest in the theremin in 1961 and applied some of its principles to the invention of the Moog synthesizer, introduced in 1964. His invention used electronic circuit boards to alter the sound of music produced by a keyboard and was the first compact, portable synthesizer. It quickly caught on with the advent of rock and funk styles and was used by groups such as the Beatles and the Rolling Stones in the late 1960s. Moog is widely acknowledged as an American pioneer of electronic music.
Many other designs followed, including widespread use of digital technology as synthesizers grew in popularity during the 1980s. One of the most influential devices was the Yamaha DX7 digital synthesizer, which used frequency modulation (FM) to generate sounds; it was released in 1983, the same year that the Musical Instrument Digital Interface (MIDI) standard for connecting electronic instruments and computers was established. The wide variety of sounds available with advanced synthesizers greatly affected music across all genres, and as the cost of such devices decreased, they became readily available to all types of musicians. Computer technology evolved alongside digital synthesizers, and by the 2000s many software applications had been developed for music synthesis.
The development of hyperinstruments began in 1986 in the Massachusetts Institute of Technology Media Lab under the direction of renowned composer and professor of media arts and sciences Tod Machover. The goal of the project was to find ways to use technology to increase the power and finesse of musical instruments for virtuoso performers. Beginning in 1992, the project's focus shifted toward creating innovative interactive musical instruments for use by nonprofessional musicians and students, thus making music technology more accessible to the general public.
How It Works
Composition. Aesthetic music compositions are typically layers of patterns that can be translated into mathematical expressions. Computer programs can perform this translation quickly, and the results can then be applied as objective parameters for developing new musical creations. Software can assist with rapid, clear musical notation and real-time playback during the composing period, as well as with subsequent complex harmonious orchestration. Songwriting is similarly facilitated, and appropriate vocal keys, harmonies, and arrangements can be derived easily.
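Under the hood, notation and playback software typically represents a score as a stream of timed note events, most often in MIDI format. The following is a minimal sketch (assuming the third-party Python library mido; any MIDI toolkit would serve) that encodes a four-note melody as a MIDI file playable in any sequencer or DAW:

```python
# Minimal sketch: a four-note melody encoded as timed MIDI events,
# the symbolic representation that notation software manipulates.
# Assumes the third-party "mido" library (pip install mido).
import mido

mid = mido.MidiFile(ticks_per_beat=480)          # 480 ticks = one quarter note
track = mido.MidiTrack()
mid.tracks.append(track)
track.append(mido.Message('program_change', program=0, time=0))  # piano voice

for note in [60, 62, 64, 67]:                    # C4, D4, E4, G4
    track.append(mido.Message('note_on', note=note, velocity=64, time=0))
    track.append(mido.Message('note_off', note=note, velocity=64, time=480))

mid.save('melody.mid')                           # ready for playback elsewhere
```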
Performance. Traditional instruments may be enhanced with computer parts that create novel yet related voices. Most electronic instruments are also available for live performance. A music synthesizer is an electronic keyboard instrument that produces electrical signals of various frequencies to generate new voices, some of which imitate traditional instruments. Synthesizers operate on programmed algorithms; the underlying synthesis techniques include additive synthesis, subtractive synthesis, frequency modulation, and phase distortion. A music sequencer is a software program or an electronic device for recording, editing, and transmitting music in MIDI format. A music sampler is similar to a synthesizer, except that instead of generating music, it plays back preloaded sound samples, such as those triggered from a sequencer, to compose as well as perform. Like synthesizers, samplers can modify the original sound signals to generate new voices, and many can generate multiple notes simultaneously.
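As a concrete illustration of the additive technique named above, the sketch below builds a tone by summing sine-wave partials at whole-number multiples of a fundamental frequency. The partial amplitudes are invented for the example rather than modeled on any real instrument:

```python
# Minimal sketch of additive synthesis: a new voice built by summing
# sine-wave partials at harmonics of a fundamental frequency.
import numpy as np

def additive_tone(freq, duration=1.0, rate=44100,
                  partials=(1.0, 0.5, 0.25, 0.125)):
    """partials[k] is the amplitude of harmonic k + 1 (illustrative values)."""
    t = np.linspace(0, duration, int(rate * duration), endpoint=False)
    tone = sum(amp * np.sin(2 * np.pi * freq * (k + 1) * t)
               for k, amp in enumerate(partials))
    return tone / np.max(np.abs(tone))   # normalize to prevent clipping

signal = additive_tone(440.0)            # concert A with four harmonics
```

Subtractive synthesis works in the opposite direction, starting from a harmonically rich waveform and filtering frequencies away.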
Recording. In the recording process, which combines music technology and audio engineering, music may be modified in its analogue form or converted to digital data before alteration. Sound mixing is the process of blending sounds from multiple sources into a desired end product. It often starts with finding a balance between vocals and instruments, or between dissimilar instruments, so that one does not overshadow another. It also involves the creation of stereo or surround sound through the placement of each sound in the sound field to simulate directionality (left, center, or right). Equalizing adjusts the bass and treble frequency ranges. Effects such as reverberation may be added to create dimension.
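A minimal mixing sketch, with invented gain and pan values, shows the two core operations just described: balancing levels between sources and placing each source in the stereo field (here using a constant-power pan law, a common convention):

```python
# Minimal sketch of mixing: balance two mono sources and place them
# in a stereo field with constant-power panning. Sine waves stand in
# for recorded vocal and guitar parts; gains and positions are invented.
import numpy as np

def pan(mono, position):
    """position runs from -1.0 (hard left) to +1.0 (hard right)."""
    angle = (position + 1) * np.pi / 4            # map to 0..pi/2
    return np.stack([mono * np.cos(angle),        # left channel
                     mono * np.sin(angle)], axis=1)

def mix(tracks):
    """Sum already-panned stereo tracks and normalize the blend."""
    out = np.sum(tracks, axis=0)
    return out / np.max(np.abs(out))

rate = 44100
t = np.linspace(0, 1.0, rate, endpoint=False)
vocal = np.sin(2 * np.pi * 440 * t)
guitar = np.sin(2 * np.pi * 220 * t)
stereo = mix([pan(vocal * 0.8, 0.0),     # vocal centered and louder
              pan(guitar * 0.5, -0.6)])  # guitar placed left, lower in the blend
```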
Auditory electrical signals may then be sent to speakers, where they are converted into acoustical signals to be heard by an audience. Digital signals may be broadcast in real time over the internet as streaming audio. Otherwise, the processed signals may be stored for future reproduction and distribution. Analogue signals may be stored on magnetic tape or phonograph disc. Digital signals may be stored on a compact disc or subjected to encoding for storage on a computer or personal music player (for example, using the MP3 format).
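Digital storage of this kind rests on the pulse code modulation mentioned earlier: the signal is sampled, quantized to integers, and written into a container format. A short sketch using Python's standard-library wave module stores one second of a 440 Hz tone as 16-bit PCM:

```python
# Minimal sketch of PCM storage: quantize a signal to 16-bit integers
# and write it to a WAV file using the standard library plus numpy.
import wave
import numpy as np

rate = 44100
t = np.linspace(0, 1.0, rate, endpoint=False)
signal = 0.5 * np.sin(2 * np.pi * 440 * t)   # one second of concert A

pcm = (signal * 32767).astype(np.int16)      # 16-bit pulse code modulation

with wave.open('tone.wav', 'wb') as f:
    f.setnchannels(1)        # mono
    f.setsampwidth(2)        # 2 bytes = 16 bits per sample
    f.setframerate(rate)
    f.writeframes(pcm.tobytes())
```

Lossy formats such as MP3 then compress this raw PCM data for more efficient storage and distribution.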
Applications and Products
Digital Audio Workstations (DAWs). In the twenty-first century, computer technology became increasingly powerful yet also more affordable and widespread. While analog music technology remained popular, digital recording became the standard. Crucially, digital audio workstations (DAWs), software programs or standalone devices capable of recording, editing, and often producing digital audio files, came into general use among professional and amateur musicians, as well as professional recording engineers. DAWs allow users not only to capture sound from traditional instruments but also to powerfully edit any audio signal or file and even create "in the box" sounds with built-in software synthesizers. They essentially give the average user much of the technological capability once confined to high-end recording studios.
Hyperinstruments. Music technology products include hyperinstruments, a term coined in 1986 for electroacoustic instruments. They were originally intended for virtuosos; customized instruments have been fabricated for musicians such as Yo-Yo Ma and Prince. However, their accessibility is being expanded for use by nonprofessional musicians and children.
The theremin is an archetypal electronic musical instrument that is played without any physical contact. The performer uses fine motor control to wave their hands in proximity to a pair of antennae that govern the pitch and volume produced. The instrument responds to every movement, whether deliberate or unintentional. A chameleon guitar has interchangeable soundboards in the central cavity that allow the same body, neck, and frets to have a familiar feel while producing various voices that imitate other instruments. Hyperpianos yield MIDI data that can be augmented into solo music or into keyboard accompaniment carefully matched to other instruments and voices. With the hypercello, the bow pressure, string contact, wrist measurements, and neck fingering are measured; the data are processed mathematically; and an enhanced signal is generated. A hyperviolin makes no sound of its own but creates electronic output when it is played with a hyperbow. The speed, force, and position of the hyperbow are measured wirelessly, the resulting data are adjusted, and the sound is enhanced accordingly for consistent, intentional expression. The hyperviolin and hyperbow have been featured in performances by virtuoso Joshua Bell.
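The theremin's control principle, continuous hand position mapped to pitch and volume, can be caricatured in a few lines of code. In this toy sketch the distances, the exponential pitch curve, and the volume scaling are all invented for illustration; a real theremin senses hand capacitance through analog oscillators rather than explicit distance values:

```python
# Toy sketch of theremin-style control: hand-to-antenna distances (cm)
# are mapped continuously to pitch and volume. All constants are invented.
import numpy as np

def theremin_sample(pitch_dist, volume_dist, duration=0.05, rate=44100):
    """Closer to the pitch antenna raises pitch; closer to the volume loop lowers volume."""
    freq = 220.0 * 2 ** ((60.0 - pitch_dist) / 15.0)  # ~4 octaves over 60 cm
    amp = np.clip(volume_dist / 30.0, 0.0, 1.0)       # full volume at >= 30 cm
    t = np.linspace(0, duration, int(rate * duration), endpoint=False)
    return amp * np.sin(2 * np.pi * freq * t)

# A slow hand sweep toward the pitch antenna yields a rising glissando
# (phase continuity between chunks is ignored for brevity).
snippet = np.concatenate([theremin_sample(d, 30.0) for d in range(60, 10, -5)])
```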
Innovative Musical Productions. The technology that emerged from the Massachusetts Institute of Technology Media Lab was channeled by MIT professor Tod Machover into a new form of opera in the broadest sense: telling a story through music. The first project, The Brain Opera, debuted in 1996. Each performance occurred in two parts. In the first half, the audience was invited to experiment with electronic instruments such as the Rhythm Tree (punching a node on the tree produces a thump from a nearby speaker), Gesture Walls (advanced theremins), and Harmonic Driving devices (driving-style video games in which players steer to make musical choices). In the second half, the music created in the first half was incorporated into a multimedia presentation, making every show unique. Its story is a psychological exploration of how people think about music. The project toured worldwide before finding a permanent home in Vienna in July 2000. The second project, Death and the Powers, debuted in September 2010. This one-act opera tells the story of an inventor, Simon Powers, who builds an electronic system into which he downloads his memories and humanity, which is then expressed by robots, animated bookcases, and a musical chandelier. Typically inanimate objects become personified with electronic music and voices in what Machover calls “disembodied performance.” Music technology is responsible for connecting audiences with the emotion of the story.
Careers and Course Work
In the United States, music technology degrees are offered at many colleges and universities. Students interested in seriously pursuing a musical technology career may choose to emphasize the musical aspects at a music-focused institution such as the Juilliard School in New York City, or focus on the technical side in an engineering program such as at the Massachusetts Institute of Technology in Cambridge, Massachusetts.
Students of music technology may pursue various levels of education. Although some associate's and doctoral degree programs are available, most are bachelor's and master's degree programs. Admission into these programs often requires the ability to read music, the ability to play a traditional instrument (piano is preferred and voice may be considered), some experience with computer hardware and software, and some experience with music recording. Nearly all programs offer courses in music history, traditional and contemporary music theory, music appreciation and critical evaluation, analogue and digital signal processing, recording, mixing, mastering, synchronization, songwriting, orchestration and ensemble performance, and music business. Many colleges and universities arrange internships with local recording studios and radio stations to provide practical application.
Graduates may pursue many career paths in music technology. One is studio production, engineering, and recording for film, television, radio, video, internet, and record labels. Another is composing, arranging, and orchestrating for film, television, theater, church, advertising, and video games. Some choose to perform in live music venues. Some work in education and research, studying music design based on how the mind interprets pitch, rhythm, melody, and harmony. Similarly, some become music critics, who must stay abreast of current trends in both popular music and advancing technology. Another option is multimedia collaboration, integrating audio and video; this work extends to web design, digital video editing, and interactive media. A related option is equipment design, creating the next generations of synthesizers, sequencers, and samplers. Another popular path is music business and administration, involving sales, marketing, and management for music retailers, production companies, and record labels.
Social Context and Future Prospects
Music technology is moving in several exciting and novel directions beyond those of audio engineering. While audio engineering continues to develop music information retrieval systems, advanced digital signal processing, and mobile recording studios, music technology is pursuing more intimate avenues that will strengthen the bond between artists and audiences.
One such area is music cognition. Technology is developing the capability to interpret and map expressive gestures made by professional musicians to personify electronic music. Similarly, researchers are mapping the brain's responses to different pieces of music to predict the effect of new compositions on an audience. Conversely, bioengineers are working to make brain waves audible so that happy, sad, fearful, or angry thoughts can be heard as musical patterns and subsequently evaluated by mental health professionals.
Another emerging area is interactive music systems, which move the audience from listening passively to thinking actively about how they react to music as they create it by making choices. This ranges from live musical performances, in which the audience is deliberately invited to participate, to unobtrusive field research in which musical stairs are installed beside escalators and observers measure how many pedestrians choose to take the stairs, getting more exercise for the reward of making music. Overall, interactive music systems make music an inclusive experience.
Bibliography
Ballora, Mark. Essentials of Music Technology. Prentice Hall, 2003.
Brown, Andrew R. Computers in Music Education: Amplifying Musicality. Routledge, 2007.
Hepworth-Sawyer, Russ, et al., editors. Innovation in Music: Performance, Production, Technology, and Business. Routledge, 2019.
Hosken, Dan. An Introduction to Music Technology. Routledge, 2011.
"The Impact of Technology on the Music Industry." Southern Utah University, 2 Feb. 2021, online.suu.edu/degrees/business/master-music-technology/tech-impact-music-industry/. Accessed 10 June 2022.
Katz, Mark. Capturing Sound: How Technology Has Changed Music. Rev. ed., U of California P, 2010.
Middleton, Paul, and Steven Gurevitz. Music Technology Workbook: Key Concepts and Practical Projects. Focal Press, 2008.
Newby, Kenneth. "Music Technology: A Timeline." Wired, Condé Nast, 29 Oct. 1997. Accessed 26 Feb. 2015.
Roads, Curtis. The Computer Music Tutorial. MIT P, 1996.
Swain, Tooshar. "Music Technology in a 21st Century Economy." National Association for Music Education, 16 May 2017, nafme.org/music-technology/. Accessed 28 Jan. 2021.
Townsend, P. D. The Evolution of Music through Culture and Science. Oxford UP, 2020.
"What Can I Do with My Degree in Music/Music Technology?" University of Kent Careers and Employability Service, 9 Oct. 2019, www.kent.ac.uk/ces/student/degree/music/index.html. Accessed 28 Jan. 2021.
Williams, David Brian, and Peter Richard Webster. Experiencing Music Technology. 3rd ed., Cengage Learning, 2008.