Composing music is an art form designed to capture an audience's attention. A great soundtrack elicits reactions that visuals cannot achieve on their own. Whether it was Mozart composing chamber music for the elite or John Williams scoring a classic Spielberg film, the craft of music has always focused on engaging people.
The same is true when composing music and creating overlays for theatrical advertising. The music should tell an engaging story that appeals to the audience, which often spans a wide demographic. However, the extensive vocabulary and jargon of the music and audio world can create confusion.
We have put together a collection of musical terms that editors and music supervisors should know, along with how knowing them can help you get better results from your composers.
1. Mixing
Mixing typically happens after composing and recording; it is the stage where the individual tracks (instruments) are blended together. The mixing engineer's job is to carve and balance every element into a perfect, cohesive score. Mixes for every medium are different, and it is best to work with an experienced music production company that understands the specific needs of mixing in your medium. For instance, a song mixed for radio play will differ from a song mixed for a television show, which will in turn differ from a song or cue mixed for a film or video game trailer. Because of tight turnarounds, composers for theatrical advertising often mix their own music.
2. Mastering
Mastering is the final stage of audio production, in which the mixed track (stereo, 5.1, 7.1, or other formats) is prepared for distribution; it's the icing and decorations on the cake. Finishing touches are added to ensure the track is consistent and qualifies to go on air or be released to the public. The mastering engineer is usually a different person than the mix engineer and is the last line of defense before the piece goes to the movie studio or record label. For fast-turnaround custom projects, the composer often performs mastering as well.
3. Compression
In simple terms, compression narrows the difference between the softest and loudest parts of a track to achieve a consistent level. It is a foundational audio effect that touches nearly every aspect of digital music creation, from sound design to mixing and mastering.
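For the technically curious, the threshold-and-ratio behavior described above can be sketched in a few lines of Python. This is a simplified static compressor curve with hypothetical names of our own (`compress_db` and its parameters), not any real plugin's algorithm; actual compressors also model attack and release timing.

```python
def compress_db(level_db, threshold_db=-18.0, ratio=4.0):
    """Simplified static compressor curve: levels below the threshold
    pass unchanged; levels above it are reduced by the given ratio."""
    if level_db <= threshold_db:
        return level_db
    return threshold_db + (level_db - threshold_db) / ratio

# A -6 dB peak with a -18 dB threshold and 4:1 ratio comes out at -15 dB,
# while a quiet -24 dB passage is untouched:
print(compress_db(-6.0))   # -15.0
print(compress_db(-24.0))  # -24.0
```

The loud peak is pulled 9 dB closer to the quiet material, which is exactly the "narrowing" of dynamic range described above.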
4. EQ
In music production, EQ stands for equalization. It is a plug-in or piece of hardware used to manipulate the frequency content of your music and sound design recordings, enabling all the elements of your track to work together sonically. Every note played by an instrument has a fundamental frequency and overtones above it that give it the specific timbre that makes it recognizable. EQ is used to carve out space for each instrument or sound to resonate fully among all of the other tracks.
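If you're curious what "manipulating frequency content" looks like under the hood, the simplest possible building block is a one-pole low-pass filter, which smooths a signal and thereby attenuates its high frequencies. This is a toy sketch with names of our own invention, far simpler than the multi-band EQs used in real productions:

```python
def one_pole_lowpass(samples, alpha=0.2):
    """Minimal one-pole low-pass filter: each output chases the input,
    smoothing rapid changes (high frequencies) while passing slow ones."""
    out = []
    y = 0.0
    for x in samples:
        y = y + alpha * (x - y)  # move a fraction alpha toward the input
        out.append(y)
    return out

# A steady input is passed through, converging toward its level:
print(one_pole_lowpass([1.0, 1.0, 1.0], alpha=0.5))  # [0.5, 0.75, 0.875]
```

A real EQ combines many such filters (shelves, bells, cuts) so each instrument can be boosted or trimmed only in the frequency range where it matters.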
5. Distortion
Music production software limits how loud an audio signal can be. Distortion happens when an audio signal exceeds that limit, producing a raspy, loud sound. While unwanted distortion can degrade the quality of a mix, deliberate distortion can brighten and add character to elements like synths and brass stabs.
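The "exceeding the limit" behavior above is called clipping, and it can be sketched in one line of Python (a hypothetical `hard_clip` helper of our own, illustrating the idea rather than any specific plugin):

```python
def hard_clip(sample, limit=1.0):
    """Flatten any sample that exceeds the ceiling; the flattened peaks
    are what produce the raspy, buzzy character of digital distortion."""
    return max(-limit, min(limit, sample))

# A waveform peak pushed to 1.5 gets flattened at the 1.0 ceiling,
# while in-range material passes through unchanged:
print(hard_clip(1.5))  # 1.0
print(hard_clip(0.3))  # 0.3
```

Deliberate distortion effects apply gentler, curved versions of this clipping to add harmonics and character instead of harshness.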
6. Reverb
Reverb is produced when a sound occurs in a space, sending its waves out in all directions. These waves reflect off surfaces, decaying in amplitude until the reflections fade. This is often loosely referred to as echo, which should not be confused with delay, the effect that makes a sound repeat. There are several ways to create reverb in music production, each adding flexibility and control to your music. In a digital workstation environment, where nearly all music is created today, there are two popular reverb types. The first is convolution reverb, in which a real-world space such as a church or concert hall is sampled and emulated by software; these samples are referred to as impulse responses, or IRs. The second is algorithmic reverb, which is generated entirely by software plugins.
Though it's not critical for you to know the technology behind creating a reverb, you may impress your composer the next time you want to emulate a real-world space, such as the Walt Disney Concert Hall: you can ask whether they have a good convolution plugin to achieve the sound you're going for.
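For the curious, "convolution" is a concrete mathematical operation: every sample of the impulse response adds a scaled, delayed copy of the input, which is how a sampled room's echo tail gets applied to dry audio. This naive sketch (our own `convolve`, not a production implementation, which would use FFT-based convolution for speed) shows the idea:

```python
def convolve(signal, impulse_response):
    """Naive convolution: each impulse-response sample contributes a
    scaled, delayed echo of the input signal."""
    out = [0.0] * (len(signal) + len(impulse_response) - 1)
    for i, s in enumerate(signal):
        for j, h in enumerate(impulse_response):
            out[i + j] += s * h
    return out

# A single click played through a decaying IR reproduces the room's tail:
print(convolve([1.0], [1.0, 0.5, 0.25]))  # [1.0, 0.5, 0.25]
```

In a real convolution reverb the impulse response is the recorded decay of an actual space, often seconds long at audio sample rates.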
7. Tempo/BPM
In musical terms, tempo is the speed at which a piece of music is played. It is typically communicated to the players in three primary ways: modern language, Italian terminology, and beats per minute (BPM). BPM assigns a numerical value to the tempo; it quite literally represents the number of beats played per minute of music, making it the most accurate way of indicating a fast or slow tempo. Sometimes, if you want something to feel faster without changing the character of the music, you may ask the composer to try using double time in their rhythmic elements. When an instrument plays double time, it plays twice as fast as the rest of the cue, which is a great way to create a sense of urgency or increase the overall energy.
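The BPM arithmetic is simple enough to show directly (a small sketch with a helper name of our own, `seconds_per_beat`):

```python
def seconds_per_beat(bpm):
    """Convert beats per minute to the duration of one beat in seconds."""
    return 60.0 / bpm

# At 120 BPM each beat lasts exactly half a second:
print(seconds_per_beat(120))  # 0.5

# Double-time rhythmic elements subdivide each beat in two, so their
# notes land twice as often -- every quarter second here -- while the
# underlying tempo of the cue stays at 120 BPM:
print(seconds_per_beat(120) / 2)  # 0.25
```

This is why double time raises the energy without changing the character: the tempo and harmonic pacing stay put while the rhythm moves twice as fast.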
8. Instrumentation
Instrumentation is the art of combining different instruments based on the timbres, or colors, they can produce in a musical composition. This can include anything from the countless combinations used in jazz bands to chamber groups and symphony orchestras.
9. Timbre/Tone
Timbre is the quality of the musical note that distinguishes the tone color of a sound or a combination of sounds. Timbre can be interchanged with Tone when describing a single instrument. But when describing the overall sound of a mix or master, Tone is the better word to use.
Round, bright, bitey, woofy, honky, smooth, muted, and aggressive are a few words that can describe the timbre of a single sound or instrument. For instance, a mellow brass part could be described as muted, dark, round, or soft, whereas a heavy or strong brass part could be described as bright, bitey, or aggressive. It helps the composer if you can point to a specific instrument or sound when referring to timbre. If you cannot identify the instrument, you can use these terms in a broad sense while pointing out that it is a particular sound, and not the overall mix, that is causing the issue.
10. Needle Drop vs. Unlimited Use License
Needle drop is a term derived from a time when playing a song required placing a needle onto a record. In the context of music sync, it refers to each time a song or cue is used in video production. A needle drop music license requires a producer to pay a fee each time they synchronize a tune with a film, TV show, or commercial. This means each time a song starts and stops, the video producer pays a new sync fee.
However, in theatrical advertising the license is often an unlimited use license, with which the film or game studio pays one fee that covers the use of a song for an entire campaign or unlimited uses in a single trailer. Film trailers tend to license for the campaign, while video game trailers tend to license per spot. If you are required to clear music for your projects, hiring a music licensing company is recommended to navigate the ins and outs of obtaining the correct license for your project.
About us
At Sencit, we live and breathe music, and we are experienced in sound design for theatrical advertising. We take pride in meeting our clients' expectations, regardless of the size of the project or campaign. From composition to supervision, our years of experience and understanding of our craft allow us to deliver full-scope music and sound solutions of the highest quality.
Reach out to us at 310-774-0123 or fill out the contact form for more information.