The Future Sound of Vehicles
March 23, 2008

 
Dane Davis created the sound of the Nebuchadnezzar from The Matrix by recording spark gaps on a 30,000-volt Jacob's Ladder. He also mixed in the sound of dry ice being pushed against sheets of steel to simulate an electromagnetic interaction between the ship's propellers and its steel body.

As far as soundtracks are concerned, vehicle sounds are among the more crowd-pleasing details, alongside the sound of weapons, explosions, monsters, alien voices and similar ear-catching handiwork. These are the signature elements of intensive sound design, and because they are so often talked about, audio designers find them especially fun to create. The techniques for creating custom vehicle sounds are peculiar and often overlooked within the field of sound design, which makes them worth a closer look: by narrowing the scope, I can deliver information that would not surface in a broader study of sound design.

In this installment, I will first introduce some of the ways well-known sound designers have approached the creation of propulsion sounds, then describe how similar sounds can be processed to simulate movement. Finally, I will discuss how the sounds can be organized into a library that enhances their usability in film and game production.

Video game sound designers and filmmakers working in the sci-fi and action genres who intend to create their sounds from scratch, rather than through field recording or by mixing sounds from commercial sound effects collections, will find this article especially helpful. Even readers not concerned with learning new sound design techniques may find it interesting purely as an account of how sound effects are created.

Sound-Producing Mechanisms

The most notable characteristic of a vehicle's sound is usually its engine or steering system - this is where the sound design can start. Scott Martin Gershin, sound designer and editor for numerous films and video games, created the sound of a starship taking off in The Chronicles of Riddick by bending the strings up on his electric guitar and sending it through a full Marshall stack and an assortment of harmonizers. When discussing the creative process, he emphasizes how important it is to experiment even when the results are not successful, because without a concerted effort to discover something unique, those happy accidents in sound design could never occur. Gershin states, "My washing machine at my house sits above a downstairs bathroom, and when the washing machine is on, it creates this interesting resonance in the bathroom. It feels like a starship or a submarine, or something to that effect...There is stuff that you can do in a home that would just blow your head away. Like taking [electric] razors and putting them in metal bowls and recording them. Or using the windows in your house on a windy day and opening them a little bit to make it sound like a wind storm or a hurricane or wind whispering across the wing of an aircraft. There are endless amounts of sounds that you can come up with."

Typically, high-tech vehicle sounds are created from malleable assortments of simpler sounds. In this way, the sound of the craft's mechanical capabilities and maneuvers can be devised by adjusting the mix, and a layered mixture helps balance the textural character. For the principal spaceship in Men in Black, supervising sound editor Skip Lievsay reveals, "I took a bunch of recordings of servo sounds--a jet takeoff, big motors from various recordings, some from libraries, some I recorded, some I got from a friend who loaned me his whole library. All those together make a nice combination of motor sounds. We wanted to de-emphasize the airplane sound and go more with an almost 50s sci-fi sound; it's a matter of just finding all the elements and combining them, finding the right size and texture."

Creating a vehicle sound with a lot of layers can also help the sound editors create a more dynamic and varied sequence. The lightcycles in the film Tron feature a combination of sounds from motorcycles and a Sequential Circuits Prophet-5. Sound designer Frank Serafine says, "I'd gone out and recorded motorcycles and processed them. I used sawblades when they turned corners, a lot of video games sounds when they'd hit the walls, a lot of pitch wheel gear shifting, using the synthesizer as my motorcycle. I tried to make it realistic. It's very difficult to create pass-bys with a synthesizer. The Doppler is the whole physics of sound, how it starts high and goes low. It's something that has to be recorded. So I went out and recorded hundreds of different motorcycles and pass-bys. I rode on a motorcycle and recorded it in special miking positions. I just did lots of experimenting."

According to Tron's supervising sound editor Michael Fremer, a very low sound budget and an unpredictable visual effects pipeline made it nearly impossible to complete the highly stylized soundtrack on time. Serafine estimates that the film included over five thousand sound effects. Watching the lightcycle scenes, it is obvious that many combinations of sounds were used to create a cohesive sequence. Each time the camera's perspective changes, one hears a new mixture of propulsion and ambience - it never gets boring. Ultimately, what makes Tron's soundtrack work is the ever-shifting blend of textures - the lightcycle sound effects are multi-dimensional and truly bring these machines to life. Serafine says that he approaches sound design like a film composer, planning out how he wants to layer the sounds on track sheets so that the sound effects composition is worked out before the sounds are physically assembled to picture. Having created a mental image of the mixture in advance, he was able to work quickly and efficiently when it was time to synchronize his work to the completed visuals (LoBrutto 224).

One of the more interesting vehicle sounds to show up in recent years was created by sound designer Dane Davis for the Nebuchadnezzar hovercraft (a.k.a. the Neb) from The Matrix series. The Neb does not feature the thrusters so common on futuristic flying vehicles; instead, the outer surface of the craft is covered with electrical "propellers". In an interview with Richard Buskin of Studio Sound, Davis explained how the sound for the Neb's propulsion system was designed:

We rented a six-foot-tall, 30,000V Jacob's Ladder, and I obtained the sort of Dopplering arc cycles that I needed for these propellers by recording this huge arc going very closely by the mic and making forward-reverse loops. I think there were ten propellers, and each of them had three forward-reverse loops of this Dopplering arc, and they were also pitch-shifted as samples. That's where all of that power comes from...There's also a lot of metallic resonating of the Nebuchadnezzar when it's moving, and, although that's propulsive it was all just produced with huge sheets of steel that were being vibrated with dry ice and things like that. So, that wasn't specifically electrical, but, being that the ship is electromagnetic, everything on the Neb is steel, and so that was another emphasis...I wanted to convey the idea that the electromagnetic field from those propellers was affecting everything--you know, like the steel was always being kind of pushed and pulled by the electromagnetic field, and so that's why I used the steel sheets.

For Battle Beyond the Stars (1980), supervising sound editor David Yewdall used a chorus of human voices to create the engines for one spacecraft. He explains, "It is the Community Choir from my hometown college of Coalinga. Choral Director Bernice Isham conducted her sopranos, altos, tenors, and basses through a whole maze of interesting vocal gymnastics, which were later processed to turn forty voices into million-pound thrust engines for the Nestar ship, manned by clone humanoids." (Yewdall 203) There were seven different ships in the film, each driven by a different race of beings. The ships sounded different from one another, which helped to elucidate the technology of each alien.

This concept is used extensively in Star Wars. Take, for example, the pod race scene in The Phantom Menace, which is in my opinion one of the best editorial sequences in the whole series. The pods, each driven by a unique creature, feature sounds that are texturally very different from one another. Ben Burtt says he wanted to "give each vehicle a personality. I consider the pilot of the craft and whether I want the audience to like or fear a certain ship or character. A pod sound can be powerful, angry, comical, smooth, cool, hip, old-fashioned, goofy, or dangerous. I try to make a sound that will relate to that type of coloration. Pod sounds were made from race cars, boats, warbirds, electric tooth brushes, shavers, motorcycles, rockets, and helicopters."

One thing I really appreciate about these sounds is their spatial dimension - the pods truly sound different from each viewing angle. The movement effects are especially realistic, and even though there are dozens of pods racing alongside each other, each one stands out clearly. The dynamic range of the mix is extreme: vehicles scream past the camera at full volume and fall into absolute silence as they reach the horizon, accentuating the aural punch of the next shot. The decision not to feature music in this scene created extra sonic space for the editors to work with - the vehicle sounds convey all of the emotion that a musical score otherwise would.

No matter how impressive they are, sound effects should not be so self-important that they pull the audience out of the story. Sound should mingle almost secretly with the screenplay and visual environment.

Ben Burtt has designed some of the most classic futuristic vehicle sounds in film. His methods are original, and his early work helped to define the sound of the science fiction genre. The sound of the TIE fighters is one of his most iconic spacecraft effects, consisting mostly of a heavily manipulated elephant call and a car zooming by on a rain-slicked highway.

This car-on-a-wet-road ingredient can also be heard in the sound of Blade Runner's hovercars (also known as police "Spinners"), which glide gently across the skies throughout the film, perfectly enriching the dystopian cityscape of a futuristic Los Angeles.

Spinners are as much a part of the visual aura of Blade Runner as the musical score. William Whittington, author of Sound Design and Science Fiction, describes how their tone fits into the soundtrack:

The romantic impulse is strongly supported by the sound design that accentuates the notion of flight. The liftoff is initiated with the chatter of processed radio voices and jet engines, which propel the hovercar upward. The sounds dissolve into the score by Vangelis, which features an orchestration of spectral voices, chimes, and synthesized tones. The music carries the visceral impact of floating, soaring, and falling as it matches the movements of the hovercar. The composition blurs the line between sound effects and music, just as the mise-en-scene blurs any difference between the exterior and interior spaces. The spectacle of flight and movement is also accentuated as two other hovercars pass beneath Deckard and Gaff and the engines whir and slip by in a Doppler shift effect into the surround channels. The scene is framed by a radio-processed voice, a controller, guiding the hovercar into the station. The spectacle and grandeur overwhelm us viscerally and emotionally, setting the stage for this richly textured world.

This deep chemistry of sound, visuals, and story can only happen through well thought-out collaboration between creative minds. The sound designers and editors need to understand the essence of the project and how the sounds will be used. A visual reference, such as an assortment of rendered stills or drawings, is very helpful when the sound design process first gets under way. It also helps to test these initial concepts against a bit of finished footage as early in the project as possible. This intermingling of processes can greatly improve the cohesion of all the different elements.

When sound designer Erik Aadahl created the transformation effects for the robotic vehicles in Transformers, the process of designing sounds and editing them to picture overlapped, which helped him get a better sense of how his ideas would work. He explained, "It all happened at the same time. The first scene I got was Blackout (at the time his name was Vortex) destroying the Qatar airbase. I had a week to come up with the transformation and weapons and destruction and the shape of that very first pass stayed pretty much intact until the end. After that first week, I had a chance to catch my breath and go conceptual again, spending my days under headphones recording everything that might be useful--scissorlift servos, remote control copters, sliding acrylic sheets, power windows--and then throwing them into ProTools to manipulate them into fun sounds. After a few weeks of that, I had a palette of several hundred fresh robot sounds that I could draw from as the movie progressed."

 

Make Magazine recently featured a DIY jet engine project that can be done using a jam jar and some gas line antifreeze. Not only will this satisfy your inner pyro but it could also provide you with an assortment of gritty combustion sounds.

 

Clever decision-making is often the mark of good sound design. Someone who creates a mind-blowing vehicle sound out of an electric toothbrush is taking advantage of a useful quirk of perception: recordings do not preserve absolute sound pressure level (SPL), so the listener cannot judge the true scale of the source. The recorded object can be quite small and still produce a big sound effect.

For example, to create the sound of a powerful internal combustion engine, try using a jam jar and some gas line antifreeze or nitro model fuel. Drill a 4mm hole in the lid, screw it back onto the jar, pour in a bit of the fuel, shake it, and light it. The fuel will ignite with a pulsating roar that sounds like a powerful race car engine. By partially covering the hole with a piece of metal, the frequency of combustion can be regulated to simulate a throttle. Be aware that the jar will get very hot and could explode. To be on the safe side, pour a little water into the jar before lighting it; the water will keep the jar cool, and the fuel will float on top of it. Also, wear safety goggles!

To create the sound of a rocket flame, put a wind-sensitive condenser microphone in front of a fan so that the air is blowing directly on it. With a little added distortion, the recording will sound like a steady explosion of fire from a solid fuel rocket or ramjet. Depending on how fast the fan is rotating, where the microphone is positioned and how much distortion is applied, one can also achieve sounds like airplane propellers or helicopter blades.

Some hairdryers have a high frequency whistle that sounds like a jet turbine if recorded up close. One may have to isolate and boost this sound with an equalizer. To make the turbine accelerate for takeoff or wind down, just bend the pitch slowly. By mixing this whining noise with the sound of rocket combustion, a realistic turbofan jet engine sound can be simulated.

Adding Movement Effects

 

The GRM Tools Classic VST Bundle (Mac, PC) is an excellent package of motion-effects plugins. It features a plugin called "Doppler", which includes a positional display showing the left and right audio channels (red and green nodes) and the sound source (white node). One can create manual or automatic movements for all three nodes, enabling sophisticated amplitude and Doppler variations.

The propulsion sound can be as simple as a short loop that is then animated with movement effects so that it becomes livelier and more three-dimensional. In this section, I will demonstrate some ways to get looped vehicle sounds moving through a scene realistically.

Ordinarily a sound engineer will reach for the gain control to position objects to the front and back of the mix and the pan control for left to right positioning. When things need to be in the background, cutting the high end a bit and adding a little reverb might do the trick. If something needs to travel along a path, such as footsteps going from the front left of the listener to the right rear, it is necessary to manipulate several of these parameters simultaneously. For the most part this method does give a realistic impression of movement. However, if you want to create the sound of something moving very fast or around other objects, you may want to dig deeper than the mixing desk. Likewise, a sound that swoops around in delicate patterns, like a bee foraging around for pollen, begs for more realism than panning and gain controls will provide.

With the help of additional effects processes, objects can be placed and moved through three-dimensional space more realistically. For instance, a very short stereo delay can be used to position an object to the left or right: delay the left channel by 20ms and the sound will seem to come from a point slightly to the right. This is one way a monaural sound can be turned into pseudo-stereo. By simply copying a sound and creating a time delay between the left and right channels, the sound seems to leap out from the speakers. One can take this a step further and design a sound that consists of two distinctly different waveforms, each with its own panning, delay, equalization and reverb. The changing mixture of the sound in relation to viewing angle gives the depicted object (such as a vehicle) a more dimensional nature.
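As a concrete illustration of the short-delay trick, here is a minimal Python sketch using numpy and scipy. The source file name is a placeholder; the sketch simply duplicates a mono loop and delays the left channel by 20ms so the image leans toward the right.

```python
# Pseudo-stereo via a short inter-channel delay (a minimal sketch).
import numpy as np
from scipy.io import wavfile

rate, mono = wavfile.read("engine_loop.wav")   # hypothetical mono source file
mono = mono.astype(np.float32)
if mono.ndim > 1:                              # fold multi-channel sources to mono
    mono = mono.mean(axis=1)
peak = np.max(np.abs(mono))
if peak > 0:
    mono /= peak                               # normalize to -1..1

delay = int(0.020 * rate)                      # 20 ms expressed in samples
left = np.concatenate([np.zeros(delay, np.float32), mono])
right = np.concatenate([mono, np.zeros(delay, np.float32)])

# Delaying the left channel nudges the perceived image slightly to the right.
stereo = np.stack([left, right], axis=1)
wavfile.write("engine_pseudo_stereo.wav", rate, stereo.astype(np.float32))
```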

 

Wave Arts' Panorama (Mac, PC) provides the ability to move and position sounds around the listener, including up, down, forward and behind. It includes an optional crosstalk cancellation feature so that the effect can be heard on speakers (usually 3D effects only work on headphones). Panorama also includes Doppler and reverb, making it an extremely powerful package. The results are so believable that it may inflict motion sickness!

 

Some processors do a very good job of randomly animating sounds. One example is a chorus, which enlivens the input signal by mixing in one or more copies of itself that are pitch-shifted, delayed and modulated. This type of effect can work well for vehicle propulsion sounds, especially when it becomes more pronounced as the vehicle moves and less pronounced when it stops. Most plugins designed to add random movement to sounds tend to be on the gimmicky side, but when they are used thoughtfully, the results can be very effective. The tricky part lies in understanding which processors should be doing what, how they should be combined and how they should be controlled to faithfully simulate movement within a scene.
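As a rough illustration of the idea (not modeled on any particular plugin), the sketch below implements a single-voice chorus as a delay line modulated by a slow LFO. The `speed` control is an invented parameter that scales the modulation depth, so the effect deepens as the vehicle accelerates.

```python
# A minimal single-voice chorus: dry signal mixed with a copy read through
# a delay line whose length wobbles with a slow LFO.
import numpy as np

def chorus(x, rate, speed=1.0, base_ms=15.0, depth_ms=6.0, lfo_hz=0.3, mix=0.5):
    n = np.arange(len(x))
    # Delay time in samples, wobbling around the base delay; "speed" (0..1)
    # is a made-up control that scales how far the delay swings.
    delay = (base_ms + speed * depth_ms * np.sin(2 * np.pi * lfo_hz * n / rate)) * rate / 1000.0
    read_pos = n - delay
    lo = np.clip(np.floor(read_pos).astype(int), 0, len(x) - 1)
    hi = np.clip(lo + 1, 0, len(x) - 1)
    frac = read_pos - np.floor(read_pos)
    wet = (1 - frac) * x[lo] + frac * x[hi]     # linear-interpolated delay read
    return (1 - mix) * x + mix * wet
```

Running a propulsion loop through this with `speed` ramping up as the vehicle gets under way gives the kind of motion-dependent animation described above.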

Recently I experienced a happy accident of my own while recording wind sounds near the shore of a reservoir. A C-5 airplane roared over my head. Planes are an immense disturbance to field recordists, and this one was no exception, being one of the largest military aircraft in the world, with four engines rated at 43,000 pounds of thrust each. Since it would put my project on hold for a few minutes anyway, I pointed the shotgun mic upward and captured it. A couple of weeks later, while cataloging my wind samples, I discovered the C-5 recording and gave it a listen. Here is a portion of it:

[passing jet]

It struck me how the sound of the plane changed over time, its harmonic structure melting like wax as it traveled from one horizon to the other. Later that day I was listening to cars zooming by and noticed a similar, much subtler effect: again it was like a liquidisation of all the frequencies, set in motion by the car's movement and influenced by the surrounding landscape. One thing I noticed about both the aircraft and the cars is that once they had moved out of sight, behind trees, the sounds were considerably less animated. The presence of objects in the surrounding landscape seems to contribute substantially to the way vehicles sound when moving. Some of the sound from the C-5 was reaching my ears directly, and some was being reflected off the ground, trees, rocks and water. This mixture of direct and indirect waves produced constant variances in amplitude across all frequencies.

When creating the sound of a moving vehicle (especially a flying one), emulating this phenomenon can add quite a bit of realism. The best way to achieve this is with an equalizer, accentuating and cutting different frequencies over time. When done correctly, the effect itself will not be noticed, but the result will be much more realistic.

In the following example, I used a white noise signal to create the sound of a rocket flying by from left to right. White noise is a random, inharmonic signal with its power spread evenly across its bandwidth (for example, 20Hz to 20kHz), and a useful ingredient for many basic sound effects. Using filters, one can sculpt it into lots of different sounds, such as cymbals, breath, earthquakes, wind, ocean waves and flying daggers. It works well enough to simulate a rocket, so I generated 5 seconds of it in Wavelab and imported the sound into my workstation for processing.
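For readers who prefer code to a wave editor, a few lines of Python (numpy plus scipy's WAV writer) will generate the same raw ingredient; the file name is arbitrary.

```python
# Generate five seconds of white noise and write it to disk.
import numpy as np
from scipy.io import wavfile

rate = 44100
noise = np.random.uniform(-1.0, 1.0, 5 * rate).astype(np.float32)
wavfile.write("white_noise_5s.wav", rate, noise)
```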

As a starting point, I created an extremely simple movement effect using only gain and panning controls. In this sample, the panning moves the sound from left to right as it plays, while the gain rises from total silence to full volume and then returns to silence.

[passing white noise]
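Sketched in code, that first pass might look like the following, assuming an equal-power pan law and a sine-shaped gain envelope (a minimal approximation of what I did with the mixer automation).

```python
# A basic left-to-right pass: equal-power pan sweep plus a gain envelope
# that rises from silence to full volume and back again.
import numpy as np
from scipy.io import wavfile

rate = 44100
t = np.linspace(0.0, 1.0, 5 * rate, endpoint=False)   # normalized 0..1 over 5 s
noise = np.random.uniform(-1.0, 1.0, t.size).astype(np.float32)

gain = np.sin(np.pi * t)                               # silence -> full -> silence
pan = t                                                # 0 = hard left, 1 = hard right
left = noise * gain * np.cos(pan * np.pi / 2)          # equal-power pan law
right = noise * gain * np.sin(pan * np.pi / 2)

stereo = np.stack([left, right], axis=1).astype(np.float32)
wavfile.write("rocket_pass_basic.wav", rate, stereo)
```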

 

Parametric equalizers are useful for more than mastering a final mix; one can use them to add extra realism to artificial motion effects. Pictured is the PTL-REQG2 Oxford EQ plug-in for Pro Tools LE.

In this half-finished state, the sound has about as much realism as could be found in an 8-bit Atari computer game in the early 80s. However, just a couple of very simple adjustments can turn this into a rocket. I achieved good results with an 8-band parametric equalizer, a lowpass filter, and a phaser. I needed the capability to perform and record parameter adjustments in real time because I had all 8 of the parametric equalizer bands moving around at once. Plugins that are MIDI-controllable handle such tasks nicely, and most digital audio workstations (such as Pro Tools) provide this capability.

First, I spread all 8 frequency bands about somewhat randomly between 50Hz and 6kHz and gave each one a slightly different bandwidth, none of them overly narrow. The initial frequency of each band was not too important as the bands would be moving around. The graphical point where the frequency and amplitude of the equalizer band intersect is often referred to as a node (an 8-band parametric equalizer will have 8 nodes). By moving a node with a mouse pointer, one can adjust the frequency and amplitude of an equalizer band simultaneously. Using MIDI, it is even possible to record these node movements - this automation capability is necessary to achieve the following effect.

To simulate the way the rocket's sound would reach the listener's ears both directly and indirectly, I automated all 8 nodes so that each one had a somewhat random motion, approximating the complex movement heard in the C-5 recording above. I recorded the node movements one at a time, for a total of 8 layers of control. Playing this back, the white noise came alive with movement.
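The sketch below approximates this technique in Python with scipy: eight peaking-EQ bands (standard RBJ biquads) whose center frequencies and gains drift along smooth random walks, applied block by block to the noise. The band count matches the example above, but the frequency range, Q and drift rates are guesses rather than the actual automation I recorded.

```python
# Randomly "moving nodes": 8 peaking-EQ bands with drifting frequency/gain,
# applied block by block while each band keeps its own filter state.
import numpy as np
from scipy.signal import lfilter

def peaking_biquad(f0, gain_db, q, rate):
    # RBJ cookbook peaking-EQ coefficients
    A = 10 ** (gain_db / 40.0)
    w0 = 2 * np.pi * f0 / rate
    alpha = np.sin(w0) / (2 * q)
    b = np.array([1 + alpha * A, -2 * np.cos(w0), 1 - alpha * A])
    a = np.array([1 + alpha / A, -2 * np.cos(w0), 1 - alpha / A])
    return b / a[0], a / a[0]

rate, block = 44100, 2048
x = np.random.uniform(-1, 1, 5 * rate).astype(np.float32)   # stand-in for the noise pass
freqs = np.random.uniform(50, 6000, 8)                      # 8 bands spread at random
gains = np.zeros(8)
states = [np.zeros(2) for _ in range(8)]
out = np.empty_like(x)

for start in range(0, len(x), block):
    seg = x[start:start + block]
    for i in range(8):
        # Slow random walk of each node's center frequency and boost/cut.
        freqs[i] = np.clip(freqs[i] * (1 + np.random.uniform(-0.02, 0.02)), 50, 6000)
        gains[i] = np.clip(gains[i] + np.random.uniform(-0.5, 0.5), -9, 9)
        b, a = peaking_biquad(freqs[i], gains[i], q=1.0, rate=rate)
        seg, states[i] = lfilter(b, a, seg, zi=states[i])
    out[start:start + block] = seg
```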

I also added a subtle lowpass filter sweep so that the filter opens as the rocket approaches and closes once it has passed. This helps to simulate air absorption: the further away a sound is, the fewer high frequencies reach the listener.

To finish it off, I added a notch filter, simulating the changing angle of surface reflection (from the rocket to the ground to the listener) along each point of the rocket's path. This is commonly known as the ground effect. When a sound source moves along the ground, some of the signal goes directly to the listener and some of it bounces off the ground and arrives indirectly. When these two sounds combine, they interfere with each other, canceling certain frequencies out. A phaser effect is a narrow notch filter that sweeps up and down, and this works well to simulate the ground effect. All I did here was create a notch that sweeps up as the rocket approaches and back down again after it has passed.
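Both finishing touches can be approximated in a few lines of Python with scipy: a lowpass whose cutoff opens and closes over the pass, followed by a sweeping notch standing in for the phaser. The specific cutoff and notch ranges below are illustrative guesses, not the settings I used.

```python
# Air absorption (swept lowpass) plus ground effect (swept notch),
# processed block by block over a 5-second pass.
import numpy as np
from scipy.signal import butter, iirnotch, lfilter

rate, block = 44100, 2048
x = np.random.uniform(-1, 1, 5 * rate).astype(np.float32)   # stand-in for the EQ'd noise
out = np.empty_like(x)
zi_lp, zi_notch = np.zeros(2), np.zeros(2)

for start in range(0, len(x), block):
    pos = start / len(x)                                     # 0..1 through the pass
    cutoff = 800 + 7000 * np.sin(np.pi * pos)                # lowpass opens, then closes
    notch_hz = 300 + 2500 * np.sin(np.pi * pos)              # notch sweeps up, then down
    b_lp, a_lp = butter(2, cutoff / (rate / 2))
    b_n, a_n = iirnotch(notch_hz, Q=8.0, fs=rate)
    seg, zi_lp = lfilter(b_lp, a_lp, x[start:start + block], zi=zi_lp)
    seg, zi_notch = lfilter(b_n, a_n, seg, zi=zi_notch)
    out[start:start + block] = seg
```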

Here is the completed sound. Compare it with the previous version, which does not include these movement effects, to appreciate how much the three processes helped. My intention here is to demonstrate what equalization and filtering, as opposed to Doppler effects, can do when simulating the sound of moving objects. In fact, a Doppler effect would do nothing audible to white noise, because white noise has no pitch to shift.

[passing white noise with equalization]

The equalization and filtering is not meant to wow the listener but instead add just the right amount of realism. As disappointing as statements like this may sound: the effect sounds better if it is not noticed!

Imagine dipping the end of a stick into a pool of water. This causes a circle of ripples to expand and dissipate--think of this as sound radiating from a stationary point in space. If the stick is moved slowly in one direction, the shape of the expanding ripples is no longer a circle but an oval. Now imagine that this is sound emanating from a moving source. The ripples move at their own speed, which is determined by the properties of the water, not the speed of the stick. As the stick moves faster, it starts to catch up with the ripples in front while the ripples behind spread further apart. This is why a police siren rises in pitch as it approaches and drops once it has passed, an effect called Doppler shift. (Blondin 2)

The speed and direction of a vehicle relative to the listener can have a dramatic impact on the way it sounds. We are accustomed to hearing a downward pitch slide as a fast-moving vehicle passes by. However, if the vehicle were coming directly toward us and passed right through us, there would be no gradual slide at all, just an abrupt jump from high to low. The sliding pitch of the Doppler effect is a function of the angle between the observer's line of sight and the path of the vehicle: the shift should be more sudden for close passes and very gradual for vehicles passing at a distance.
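To make this concrete, here is a small Python sketch of the geometry: the observed-to-emitted pitch ratio for a source moving at speed v along a straight path whose closest approach to the listener is d. The numbers are arbitrary; the point is that the curve steepens as d shrinks.

```python
# Doppler pitch ratio along a straight fly-by, with closest approach at t = 0.
import numpy as np

def doppler_ratio(t, v=80.0, d=20.0, c=343.0):
    """Ratio f_observed / f_emitted; v in m/s, d = closest-approach distance in m."""
    radial_speed = (v * v * t) / np.sqrt((v * t) ** 2 + d ** 2)   # + receding, - approaching
    return c / (c + radial_speed)

t = np.linspace(-3, 3, 7)
print(np.round(doppler_ratio(t, d=5.0), 3))    # close pass: abrupt high-to-low transition
print(np.round(doppler_ratio(t, d=100.0), 3))  # distant pass: gentle slide
```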

 

A United States Navy F/A-18E/F Super Hornet in "transonic" flight, meaning that it is changing from subsonic to supersonic speed. The front of the aircraft has broken the sound barrier, which has caused water molecules in the air to condense in the low-pressure area behind it and form a cloud. This phenomenon makes it possible to "see" the sound barrier.

An interesting use of the Doppler effect can be heard in the opening scene of Apocalypse Now, a scene often referred to as "The Ghost Helicopter Flyover". A synthesized beating sound sweeps slowly from the back of the theatre to the front in Dolby 5.1 surround. A helicopter suddenly passes left to right on the screen, and at that moment the pitch of the sound drops. The effect is totally unrealistic and entirely synthesized (it's a Moog, in fact), but it is also one of the most memorable sound effects to come out of 1970s cinema, and it truly works. This shows that it is possible to fake the Doppler effect electronically and still create a great sound effect.

Vehicles moving at supersonic speeds may actually pass by the listener in complete silence, because the aircraft has exceeded the speed of its own sound. When the sound finally arrives, there is an explosive crack called a sonic boom: all the wavefronts that would normally be radiating ahead of the aircraft are piled on top of each other and hit the listener at the same time, like a wall.

If an aircraft approaching slower than the speed of sound broadcast two notes, A then B, an observer standing on the ground would hear note A followed by note B. If the aircraft were approaching faster than the speed of sound and broadcast the same two notes, the observer could actually hear note B followed by note A, because note B was broadcast closer to the observer than note A. The sound heard after the sonic boom of a supersonic aircraft is actually perceived by the listener in reverse order. When I first considered this, I had a hard time grasping the logic. Just imagine a stone skipping across the water. Each time the stone hits the surface, a circle of ripples is created. The ripples caused by the skipping stone will not reach any particular point in the water in the same order they were created: ripples from the stone's last skip could be felt first, followed by ripples from the second-to-last skip, and so on, until all the ripples are felt in reverse order. Sound works the same way!

Compiling a Vehicle Sound Library

Now I will discuss how these sounds can be prepared for interactive applications. Sounds for vehicles can easily be divided into a multi-layered sound set that directly correlates with the functionality of the vehicle. A steam locomotive could have individual sounds associated with the engine, exhaust, rolling wheels, creaking metal joints, etc. All of these little elements can be positioned in three-dimensional space and exist as an interactive environment of sound. Together, these sounds could be organized into a "sound library" for the locomotive.

"Library" is a pretty broad term that sound artisans use to describe a set of themed sounds. A library is basically an assortment of sounds that in some way relate to one another. The relationship may be as simple as parity of tone, such as "bad weather sounds", or "footsteps on linoleum." In this case, organizing the sounds into a library facilitates the searching and finding process.

If the library is based around the actions of a specific object, the sounds may actually depend on one another. For example, the propulsion sounds in a military tank library might be organized according to a series of states such as "start - idle - accelerate - cruise - decelerate - stop"; individually, these sounds are of limited use, because each relies on the others for context. The same library might also feature an assortment of sounds meant to be played in parallel with the propulsion sounds, such as mechanical squeaks, weapon releases, terrain sounds, and hatches opening and closing. This type of library has an elegant structure for interactive applications and may even be a useful organizational format for film sound design (a minimal sketch of such a structure follows below).
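Here is one way such a state-driven library might be laid out in code. The file names, states and events are invented for illustration; the point is the structure: loops per state, one-shots that bridge state changes, and parallel event layers.

```python
# A minimal sketch of a state-driven vehicle sound library (all names invented).
from dataclasses import dataclass, field

@dataclass
class VehicleSoundLibrary:
    loops: dict = field(default_factory=dict)         # state -> looping file
    transitions: dict = field(default_factory=dict)   # (from_state, to_state) -> one-shot
    parallel: dict = field(default_factory=dict)      # event -> list of layered files

tank = VehicleSoundLibrary(
    loops={
        "idle": "tank_idle_loop.wav",
        "cruise": "tank_cruise_loop.wav",
    },
    transitions={
        ("stop", "idle"): "tank_start.wav",
        ("idle", "cruise"): "tank_accelerate.wav",
        ("cruise", "idle"): "tank_decelerate.wav",
        ("idle", "stop"): "tank_shutdown.wav",
    },
    parallel={
        "turret_rotate": ["turret_servo_01.wav", "turret_servo_02.wav"],
        "hatch": ["hatch_open.wav", "hatch_close.wav"],
    },
)

def bridge(lib, current, target):
    """Return the one-shot that carries the engine from one state to the next."""
    return lib.transitions.get((current, target))

print(bridge(tank, "idle", "cruise"))   # -> tank_accelerate.wav
```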

Once the library for a given object becomes very complete, it can be organized into a three-dimensional structure, meaning that the sounds in the library have a positional relationship (what the vehicle sounds like from the front, from the back, from the interior, and so on). A three-dimensional library is especially useful for large vehicles such as space stations, so that the observer can "feel" where they are in relation to the propulsion system, landing gear, control room, outer walls, etc.

An even more exhaustive sound library might combine several organizational relationships. Consider, for example, a vehicle that is zooming by closely, accelerating to high speed and firing a weapon at the same time. It would be more realistic to apply motion effects to both the propulsion sound and the weapon rather than just the propulsion sound. Therefore, the weapon sounds should exist as part of the library so that the propulsion state and the weapon shots can be processed at the same time. If a sound library is not organized according to such relationships, it may just be a folder full of somewhat similar sounds that are layered together in the hope of covering every required action. This free-form, organic approach is often the way things go with sound design, and I admit that it works for me sometimes, especially for film. However, a more structured sound library provides the flexibility and completeness required for today's interactive media. Perhaps more importantly, it is simply inspiring to build sound libraries that have an interactive structure, and I encourage people to try this with vehicle sounds.

Sound effects editor David Lewis Yewdall refers to this concept as "total immersion entertainment", where linear soundtracks are divided up into a "bundle of audio-event files" with interactive layers and triggers that are "linked to precise assignments of action and reaction." In his book "Practical Art of Motion Picture Sound" he describes how one could break down the sound of getting into a car into singular events:

...The reality of getting into and driving a car requires a multitude of audio-event cues, each happening at precise assigned cause-and-effect points of action - either end-to-end, overlapping, cyclical or single performance, perhaps repeated action with the same audio file assignment fired as a random-access variation to thwart carbon-copy syndrome. You open the door which triggers CAR DOOR LATCH OPEN. This door latch is only half of the cue. The second half will not play until the operator pulls the door open and the mechanism passes the latch plate...When the door is opened to its widest point, a third audio cue, the CAR DOOR HINGE BOUNCE, will activate. This brief but extremely real and satisfying sound sells the heft and reality of the door as it has reached the limit of the hinge swinging open...Now the software reads the speed at which the operator closes the door and it chooses from among three to five variations of door-closings, depending on the close velocity.

Yewdall goes on to describe the different audio cues associated with getting into the car, putting the key into the ignition, turning it, etc. Basically any series of actions can be broken down into an interactive library like this. At any time, the library can be augmented with additional variants and actions in order to cover all scenarios. The creation of re-workable non-linear sound libraries is becoming a more important skill to learn in the midst of emerging interactive technologies.
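As a loose illustration of the "random-access variation" idea, the sketch below picks a door-close recording by close velocity and avoids repeating the last file played in each bucket. All file names and velocity thresholds are invented.

```python
# Velocity-dependent door-close cue with repeat avoidance (illustrative only).
import random

DOOR_CLOSES = {
    "soft":   ["door_close_soft_01.wav", "door_close_soft_02.wav", "door_close_soft_03.wav"],
    "normal": ["door_close_med_01.wav", "door_close_med_02.wav", "door_close_med_03.wav"],
    "slam":   ["door_close_slam_01.wav", "door_close_slam_02.wav", "door_close_slam_03.wav"],
}
_last_played = {}

def door_close_cue(velocity):
    """Pick a door-close variation by velocity, avoiding a carbon-copy repeat."""
    bucket = "soft" if velocity < 0.5 else "normal" if velocity < 1.5 else "slam"
    choices = [f for f in DOOR_CLOSES[bucket] if f != _last_played.get(bucket)]
    pick = random.choice(choices)
    _last_played[bucket] = pick
    return pick

print(door_close_cue(0.3), door_close_cue(2.0))
```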

The pod racer engine sounds in The Phantom Menace were designed with recordings of race cars, rockets, motorcycles, shavers, electric tooth brushes, helicopters, warbirds and boats.

 

Takeoff

As audiences come to expect more, sound design must keep pace. The most effective response is not simply to raise the energy and loudness but to deepen the experiential complexity of the sound. This requires out-of-the-box thinking about how sound effects are designed, how they relate to scenes and depicted objects, and how they function together as part of a full-scale soundtrack or interactive world. I hope that the concepts and experiences in this article provide readers with the ammunition to construct more arresting vehicle sounds for their next project.

References:

Aadahl, Erik. "Transformers: Part 3." Filmsound Daily. n. pag. Online. Internet. 4 Jul. 2007. Available: http://filmsounddaily.blogspot.com

Blondin, Darren. "Recording Underwater Ambiences." Projects & Research in Sound Design. Par 2. Online. Internet. 15 Oct. 2007. Available: www.dblondin.com

Burtt, Ben. "Ben Burtt answers questions about sound design of Star Wars." n. pag. Online. Internet. Available: http://filmsound.org

Davis, Dane. "The Matrix: Young Guns, New Tricks." Studio Sound. n. pag. Online. Internet. April 1998. Available: http://filmsound.org

Filippone, A. Advanced Topics in Aerodynamics: Aerodynamic Noise. n. pag. Online. Internet. Available: http://aerodyn.org/Acoustics/Sound/sound.html

Fremer, Michael. "The Making of a Soundtrack." n. pag. Online. Internet. May 2005. Available: www.ultimateavmag.com

Gershin, Scott. "Chronicling Riddick: Simultaneous Sound Design for Film, Games, Anime." Mix. n. pag. Online. Internet. 1 Aug. 2004. Available: www.mixonline.com

Gershin, Scott. "It's Only Make Believe." Electronic Musician. p. 3. Online. Internet. 1 Jun. 2007. Available: www.emusician.com

Lievsay, Skip. "Star Wars: a new sound." n. pag. Online. Internet. May 1997. Available: http://lavender.fortunecity.com

Serafine, Frank. Interview. LoBrutto, Vincent. Sound-on-Film: Interviews with Creators of Film Sound. New York: Praeger, 1994.

Warmbrodt, William. "Helicopter Noise." Pulse of the Planet. n. pag. Online. Internet. Available: www.pulseplanet.com

Whittington, William. Sound Design & Science Fiction. Austin: University of Texas Press, 2007.

Yewdall, David Lewis. Practical Art of Motion Picture Sound. New York: Focal Press, 2003.

 

 


 

 
Darren Blondin, 2010