Case Study Framework

It is essential for any professional to have a plan with which to attain their goals. "To be in control of crafting the artistic product, one must be in control of the recording process, be fluent in the ways the recording process modifies sound, and to be skilled in executing well-defined creative ideas" (Moylan, 2006). For audio professionals, reference tracks set a guideline for the end product of a song, whether for composition, mixing or mastering. It only makes sense, then, to be able to pick apart a reference track for all its hidden subtleties.

In this framework I will set out a guide for breaking a song down into its various elements in order to analyse them more accurately. This will serve as an important tool that I intend to use in future projects.

Critical Listening

"Inconsistencies between the various states of sound are present throughout the audio recording process. Many of these inconsistencies are the result of the human factor: the ways in which humans perceive sound and interpret or formulate its meanings. In order for the artist (audio engineer/recordist) to be in control of the artistic processes they must understand the substance of their material: sound, in all its inconsistencies," (2006). Of course there are numerous tools out there, both analog and digital, to assist in achieving this. Such as: EQ, compression, reverb, etc. However, to properly make use of these tools we must focus on the key characteristics of music including: frequency content, dynamics, composition (Key, tempo, time signature, chord progression, structure), instrumentation and soundstage. Critical listening is the method, in which, I will be identifying these characteristics. 

Spectral Balance

The human ear can perceive sound within the frequency range of 20 Hz - 20 kHz, but many instruments (and voices) do not occupy that full range. For instance, male and female voices occupy a similar frequency range, yet most male voices possess some lower frequencies not usually found in female vocals, as visualised in the diagram below. We also perceive some frequencies more clearly than others, and not every frequency in an instrument's range is integral to its timbre. For these reasons sounds must be spectrally balanced with one another, otherwise some elements may get lost in the mix.
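Before reaching for any tools it helps to see roughly where a track's energy actually sits across the spectrum. Below is a minimal sketch of that idea in Python with NumPy and SciPy; the file name "reference.wav" and the band edges are my own placeholder choices (loosely following the regions on the frequency chart below), not part of any case study.

```python
# A minimal sketch of checking where a track's spectral energy sits.
# "reference.wav" is a placeholder file name; the band edges are my own approximation.
import numpy as np
from scipy.io import wavfile

rate, samples = wavfile.read("reference.wav")
if samples.ndim > 1:                               # fold stereo to mono for a single spectrum
    samples = samples.mean(axis=1)
samples = samples.astype(np.float64)

window = np.hanning(len(samples))                  # reduce spectral leakage
spectrum = np.abs(np.fft.rfft(samples * window))
freqs = np.fft.rfftfreq(len(samples), d=1.0 / rate)

bands = [(20, 250, "lows"), (250, 2000, "mids"), (2000, 6000, "upper mids"), (6000, 20000, "highs")]
total = spectrum.sum()
for lo, hi, label in bands:
    mask = (freqs >= lo) & (freqs < hi)
    print(f"{label:10s} {100 * spectrum[mask].sum() / total:5.1f} % of spectral energy")
```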

"To determine the equalization or spectral balance that best suits a given recording situation, an engineer must have well-developed listening skills with regard to frequency content and its relationship to physical parameters of equalization: frequency, gain and Q. Each recording situation calls for specific engineering choices and there are rarely any general recommendations for equalization that are applicable across multiple situations. When approaching a recording project, an engineer should be familiar with existing recordings of a similar musical genre or some idea of the timbral goals for a project to inform the decision process during production," (Corey, 2012).

I intend to listen for these issues and how they have been addressed in my case studies. Concentrating on the way different elements of a song are separated from one another will give a better indication of how to achieve a similar outcome in my future projects.

Frequency Reference Chart (www.independentrecording.net)

Frequency Analyser (Pro Tools)

Dynamics

Dynamics can make a huge impact on the energy of a piece of music. "Engineers...understand that dynamic contrast is important to help convey musical emotion. It begs the question, if the level of a vocal track is adjusted so that the fortissimo passages are the same loudness as the pianissimo passages, how is a listener going to hear any dynamic contrast?" (Corey, 2012). Louder sounds obviously bring more energy, while quiet sounds carry little. Yet a quiet section of a song preceding a louder section will feed more energy into that louder section. In the same way, a louder section that transitions into a quiet section can have the audience listening intently in the hope the song will rise again, creating an intense and intimate moment in the piece.

Identifying a song's dynamic range and the way instruments interact within that range will define how I can manipulate the energy within my own mixes and compositions.
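One simple way to put a number on that range is to chart short-term level across the track. The sketch below does this in Python; the file name "reference.wav", the 400 ms window and the simple dBFS spread figure are my own choices, not a formal loudness standard such as EBU R128.

```python
# A rough sketch of charting a song's short-term dynamics.
# "reference.wav" is a placeholder; window length and the spread figure are informal choices.
import numpy as np
from scipy.io import wavfile

rate, samples = wavfile.read("reference.wav")
if samples.ndim > 1:
    samples = samples.mean(axis=1)
samples = samples / (np.max(np.abs(samples)) + 1e-12)    # normalise to full scale

hop = int(rate * 0.4)                                    # ~400 ms analysis windows
levels = []
for start in range(0, len(samples) - hop, hop):
    frame = samples[start:start + hop]
    rms = np.sqrt(np.mean(frame ** 2)) + 1e-12           # avoid log(0) on silence
    levels.append(20 * np.log10(rms))
levels = np.array(levels)

print(f"quietest window: {levels.min():.1f} dBFS")
print(f"loudest window:  {levels.max():.1f} dBFS")
print(f"spread:          {levels.max() - levels.min():.1f} dB")
```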

Composition

"In order to achieve a great mix you should start with a great song," said every audio engineer ever. A great mix of an ordinary song will generally not be as good as an amateur mix of a great song. This is why it is important to evaluate the composition of a song. This includes the way a song is structured; the instrumentation used; key signature; chord progression; time signature; and tempo. identifying these elements gives focus in how to tame the elements of a song from its origin. 

To aid in identifying these compositional elements I have chosen to use Pro Tools, as it is my preferred DAW for mixing and recording and I am most familiar with its features. Pro Tools also gives a visual representation of the different compositional elements. Below is an example of a composition evaluation of a song with minimal instrumentation that I will be using in a future case study. The different elements can clearly be seen identified visually: song length (mins:secs), bar count, tempo, time signature (and time signature changes), key, structure, dynamics and instrumentation, and at the bottom I have used a volume automation line to signify the energy shifts throughout the song.

Song structure/composition/instrumentation (Pro Tools)
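As a cross-check on the tempo and bar count mapped out in Pro Tools, the same information can be estimated directly from the audio. Below is a minimal sketch using the librosa library's beat tracker; the file name "case_study.wav" is a placeholder, and an automated estimate still needs to be verified by ear.

```python
# A minimal sketch of estimating tempo from audio with librosa, as a cross-check on
# a tempo map built by ear in Pro Tools. "case_study.wav" is a placeholder file name.
import librosa

y, sr = librosa.load("case_study.wav", mono=True)            # decode and resample the track
tempo, beat_frames = librosa.beat.beat_track(y=y, sr=sr)     # global tempo estimate + beat positions
beat_times = librosa.frames_to_time(beat_frames, sr=sr)

print("estimated tempo (BPM):", tempo)
print("first few beats (s):", beat_times[:4])
```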

Soundstage

Creating 'space' in a mix can be difficult, but it is important to give the listener a realistic perception of the sounds they are hearing. "Delay and reverberation help create a sense of depth and distance in a recording, helping to position some sound sources away (i.e. upstaging them) while other less reverberant elements remain to the front of a phantom image sound stage. Not only can an engineer make sounds seem farther away and create the impression of an acoustic space, but he can influence the character and mood of a music recording with careful use of reverberation. In addition to depth and distance control, the angular location of sound sources is controlled through amplitude panning" (Corey, 2012). This also helps to put the listener 'in the moment' and reach them on a more emotional level. Identifying how elements such as stereo imaging, reverb and delay are used will lead to a controlled aural environment for listeners of my work.
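Stereo width is one part of the soundstage that is easy to measure. The sketch below compares left/right correlation and mid/side energy for a stereo file; "reference.wav" is a placeholder and the interpretation notes in the comments are informal rules of thumb, not fixed standards.

```python
# A small sketch of gauging stereo width from a stereo WAV file.
# "reference.wav" is a placeholder; the comments are informal rules of thumb.
import numpy as np
from scipy.io import wavfile

rate, stereo = wavfile.read("reference.wav")
stereo = stereo.astype(np.float64)
left, right = stereo[:, 0], stereo[:, 1]

mid = (left + right) / 2                                  # centre (mono) content
side = (left - right) / 2                                 # content unique to each side

correlation = np.corrcoef(left, right)[0, 1]              # +1 mono-like, near 0 very wide, negative = phase problems
side_ratio = np.sum(side ** 2) / (np.sum(mid ** 2) + 1e-12)

print(f"L/R correlation: {correlation:+.2f}")
print(f"side/mid energy ratio: {side_ratio:.3f}")         # higher values suggest a wider, more reverberant image
```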

 

Analytical Listening

Identifying the critical elements of a recording is a great way to recreate the energy and sound of a song, but it's only the first step. The analytical side of things is just as crucial. Artists create music for a reason, and as an audio professional it's important to find out what that reason is. "Meaning is the essence of every music production. If the meaning of a song does not come across clearly in the production, you will confuse the listener. The meaning of song is conveyed by adapting the characteristic feeling into the instrumentation. The setup of each instrument and way you choose to record them will greatly influence the way those instruments express themselves" (White, 2013). If you don't question why other artists create music, then you're probably not questioning why you make music yourself. It's important to have a purpose in order to achieve goals.

Questioning an artist's motives involves researching:

  • Where were they in their music career, and how was their music evolving?
  • Who/What were their influences?
  • What is the history of the genre?
  • What are the elements that distinguish this as a genre?

References

Suiriryoku. (2011, Oct 24). Tycho - Daydream [Video File]. Retrieved June 18th, from https://www.youtube.com/watch?v=UFr9StkVwTk

Moylan, W. (2014). Understanding and Crafting the Mix. Independence, US: Focal Press. Retrieved June 12th, from http://www.ebrary.com

Corey, J. (2012). Audio Production and Critical Listening: Technical Ear Training. Burlington, US: Focal Press. Retrieved June 17th, from http://www.ebrary.com

White, M. (2013). Critical Listening vs. Analytical Listening. Music Production Guide. Retrieved June 12th, from http://www.music-production-guide.com/critical-listening.html

Interactive Frequency Chart [Image]. Retrieved June 12th, from http://www.independentrecording.net/irn/resources/freqchart/main_display.htm

Blue Cat's FreqAnalyst 2 [Image] (2014). Blue Cat Audio. Retrieved June 16th.

Pro Tools 12 [Image] (2015). Avid. Retrieved June 16th.