
Game Sound Design Tips from Dwight Okahara

Oct 19, 2017

The Audio Lead at L.A.’s Insomniac Games and sound designer for Sunset Overdrive and Spider-Man shares his work methods and favorite tools and techniques.

“As a sound designer I want to create interesting sounds that will stick in people’s heads,” says Dwight Okahara, Audio Lead for Los Angeles’ Insomniac Games and a major contributor to the audio in hit titles such as Sunset Overdrive, Ratchet & Clank and the upcoming Marvel’s Spider-Man. We sat down with Dwight to find out what it takes to create iconic sounds for the world’s most popular games.

Dwight, what are the main challenges when working on sound for games, as opposed to movies or television?

In two words: real time. The most exciting part of working on dynamic media versus linear media such as TV or movies is that you have to think of audio in real time. Let’s say that the player is firing a weapon. We would create a great, dynamic sound effect for that weapon. In a linear world, you’d know the exact use for that weapon SFX and be done. But for game audio, you don’t know what kind of situation the gun will be used in. At any given time the sound effect could be fired in an alley, a cavernous space, or even underwater. The sound could also be played thousands of times in a ten-hour game experience. You need to account for randomness and variety for that sound effect, so it doesn’t become stale and remains iconic and fun to hear.
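To make that "randomness and variety" concrete, here is a minimal illustrative sketch – not Insomniac's actual code – of the kind of per-shot variation a game audio system applies so a weapon sound stays fresh across thousands of plays: pick one of several pre-rendered variations and nudge its pitch and level slightly on every trigger.

```cpp
// Hypothetical example of per-trigger sound variation.
#include <random>

struct ShotInstance {
    int   variationIndex;  // which pre-rendered .WAV variation to play
    float pitchSemitones;  // small random detune
    float gainDb;          // small random level offset
};

class GunshotRandomizer {
public:
    ShotInstance next() {
        std::uniform_int_distribution<int>    pickVar(0, kNumVariations - 1);
        std::uniform_real_distribution<float> pitch(-0.5f, 0.5f); // +/- half a semitone
        std::uniform_real_distribution<float> gain(-1.5f, 0.0f);  // up to 1.5 dB quieter

        int candidate = pickVar(rng_);
        if (candidate == lastVariation_)                  // avoid playing the same
            candidate = (candidate + 1) % kNumVariations; // file twice in a row
        lastVariation_ = candidate;

        return { candidate, pitch(rng_), gain(rng_) };
    }

private:
    static constexpr int kNumVariations = 5;
    int lastVariation_ = -1;
    std::mt19937 rng_{std::random_device{}()};
};
```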

How do you start working on audio for a new game? Can you take us through the creative process, from research to implementation?

All high-budget, high-profile games – what we call AAA games – go through extensive pre-production, where representatives from every discipline – design, programming, writing, art, audio – come together to define the style of the game and the universe they’ll be making.

Throughout pre-production, the core audio team – sound designers, dialogue specialists, music specialists – will create prototypes to define the feel for the game’s environment, weapons, heroes, etc. We create what we call ‘soundscapes,’ where we ‘paint’ an audio-only picture. We play this back for the pre-production group and they get a good idea of how we want the city to live and breathe, how we want the hero to traverse and explore the world, and how music and dialogue will create the emotional backbone for the game’s universe.

Is all the audio made in advance, or is there some processing done in real time inside the game engine?

To create a sound effect, we’ll start with the source sounds we want to work with – they can come from personal SFX libraries, commercial SFX libraries, or field recordings – and we’ll design the sound using multiple plugins. Once we’ve created a sound effect, we’ll bounce that element out as a .WAV file and implement that into our sound engine. We use Wwise in our games, and once a sound is implemented, the real-timeness of the game takes over. When that sound is played back in the game, it hits any appropriate reverb, EQ or occlusion/obstruction filters that we have created, so that the sound plays back as expected in the environment whenever the player is triggering it.
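As a rough sketch of what "implementing a sound into Wwise" can look like on the game-code side: the SDK calls below (SetSwitch, SetRTPCValue, PostEvent) are real Wwise functions, but the event, switch and RTPC names are hypothetical examples, not Insomniac's actual project content.

```cpp
#include <AK/SoundEngine/Common/AkSoundEngine.h>

void FireWeapon(AkGameObjectID weaponObj, float distanceToListener, bool isUnderwater)
{
    // Tell Wwise which acoustic environment the shot happens in; the Wwise
    // project then routes the sound through the matching reverb/EQ chain.
    // "Environment", "Underwater" and "Alley" are illustrative names only.
    AK::SoundEngine::SetSwitch("Environment",
                               isUnderwater ? "Underwater" : "Alley",
                               weaponObj);

    // Drive a real-time parameter that the sound designer has mapped to
    // attenuation and filter curves inside the Wwise project.
    AK::SoundEngine::SetRTPCValue("Distance_To_Listener", distanceToListener, weaponObj);

    // Trigger the event that plays the bounced .WAV (with all its variations).
    AK::SoundEngine::PostEvent("Play_Weapon_Fire", weaponObj);
}
```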

For example, imagine an enemy firing a gun at you from an alley. You, the player, would need to realize where that sound is coming from, so the sound would have to be processed in real time taking into account the reverb type, material types (varying degrees of absorption) and fall-off distance that we assign to that source sound effect.
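The fall-off distance and material absorption he describes can be pictured with a generic calculation like the one below. Real engines use designer-authored curves and proper DSP; this is only a simplified sketch, not Insomniac's engine code.

```cpp
#include <algorithm>
#include <cmath>

struct SourceAudioParams {
    float minDistance;        // full volume inside this radius (metres)
    float maxDistance;        // inaudible beyond this radius (metres)
    float materialAbsorption; // 0 = open air, 1 = heavily absorbent wall
};

struct RealtimeAudioState {
    float gain;             // linear gain applied to the dry sound
    float lowpassCutoffHz;  // cutoff for the occlusion/obstruction filter
};

RealtimeAudioState ComputePlaybackState(const SourceAudioParams& p, float distance)
{
    // Linear fall-off between minDistance and maxDistance.
    float t = std::clamp((distance - p.minDistance) / (p.maxDistance - p.minDistance),
                         0.0f, 1.0f);
    float gain = 1.0f - t;

    // More absorbent occluding materials pull the low-pass cutoff down,
    // muffling the sound so the player can tell it is behind something.
    float cutoff = 500.0f + 19500.0f * std::pow(1.0f - p.materialAbsorption, 2.0f);

    return { gain, cutoff };
}
```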

So, some of the work takes place in a DAW (I personally use Pro Tools), while the rest is processed in the game engine. I’d say that 40% of a sound is the creative work and the other 60% is implementing that sound so it sounds as intended.

Which processing tools do you find yourself coming back to in your workflow, and why?

I have an analogy: creating a sound effect is a lot like another passion of mine, cooking. I look at the various sound elements as my ingredients and the plugins as my herbs and spices.

All the members of our core audio team use, at a minimum, the plugins in the Waves Gold and Platinum bundles. I really like the way the H-Series plugins sound too. All four of them – the compressor, EQ, reverb and delay – have a very musical feel and give a nice, velvety sheen to my sound.

When working with very impactful sounds like explosions and gunfire, I like running them through the J37 Tape emulator. I think our ears respond in a positive manner to pleasant distortion, and this is the plugin that can make those “ear-hurty” sounds mellow out and make them feel a little more gratifying.

A good compressor is almost always there, too. The Linear Phase Multiband Compressor is a workhorse. It stays out of the way by letting me surgically approach a sound effect. That sound effect might span across quite a few frequencies, and the Linear Phase compressor helps you retain the parts of the sound that grabbed your interest in the first place.

MaxxBass is also a go-to when you know that a lot of players in your audience don’t have access to a subwoofer in their home setup. MaxxBass can pull out frequencies that make impact sounds like explosions and gunfire sound really big. It’s still relevant today because you can use it on some of your discrete LFE sounds.
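MaxxBass itself is proprietary, but the general psychoacoustic-bass idea behind making low end audible on small speakers can be sketched: take the low band the speakers cannot reproduce, generate upper harmonics from it, and blend those in so the ear infers the missing fundamental. The waveshaper below is a conceptual illustration of that technique only, not Waves’ algorithm.

```cpp
#include <cmath>
#include <vector>

// lowBand: the already low-pass-filtered bass portion of the signal.
// dry:     the unprocessed signal, same length.
std::vector<float> AddPerceivedBass(const std::vector<float>& lowBand,
                                    const std::vector<float>& dry,
                                    float harmonicsGain)
{
    std::vector<float> out(dry.size());
    for (size_t i = 0; i < dry.size(); ++i) {
        // A soft waveshaper on the low band creates upper harmonics of the
        // original bass frequencies, which sit where small drivers can play.
        float harmonics = std::tanh(3.0f * lowBand[i]) - lowBand[i];
        out[i] = dry[i] + harmonicsGain * harmonics;
    }
    return out;
}
```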

Another plugin I appreciate when dealing with impactful sounds like explosions is Smack Attack. An explosion sound can be a very complex thing, and Smack Attack helps me shape the way a particular element stands out in the overall explosion effect.

There are two other Waves plugins that fall into the ‘salt-and-pepper’ category, meaning I use them quite a bit to bring out small but noticeable results: Vitamin Sonic Enhancer and Infected Mushroom Pusher. Vitamin gives me that brilliance you might have lost along the way when using multiple plugins, so I save it for the end of creating a sound effect. A little goes a long way here and I really use it gently. As for IM Pusher, it can help me showcase certain subtle frequencies from a sound effect I’ve created – not necessarily the higher frequencies that stand out anyway – and make them pop out.

What about synths? How do you use them in your process?

Virtual instruments definitely have their place in my ‘kitchen.’ When designing UI sounds for games, it’s important to have tools that let you create small, precise sound effects that are still interesting and tactile. The Element and Codex synths are both easy and quick to use, so I can dial up that button click, slider notch sound, confirmation beep, etc.

And how do you design the audio to create a sense of space in a game?

Most of our sense of space comes from the environmental, real-time reverb and delay that we add to various parts of the game. We try to keep a lot of the sound effects dry, as we don’t know which exact situation they would be used in.

Although we have Doppler built into our sound engine, sometimes I like using the Waves Doppler plugin, especially when working on cinematics where you might need a helicopter or jet flying by the listener. Doppler gives you immediate results and delivers the sound you’d expect in the real world. When working in post, you’ll find that changes in animation mean you might need to go back and revisit hard sound effects you had timed out perfectly at one point. The Doppler plugin makes it really easy to go back and re-sync sounds like that.
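For reference, the physics behind a fly-by like that helicopter example is the textbook Doppler shift: as the source approaches, its velocity component toward the listener raises the perceived pitch, and as it passes, the pitch drops. The sketch below computes that pitch ratio for a moving source and a stationary listener; it is the generic formula, not the internals of the Waves Doppler plugin or Insomniac’s engine.

```cpp
#include <algorithm>
#include <cmath>

struct Vec3 { float x, y, z; };

static float Dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }
static float Length(Vec3 v)      { return std::sqrt(Dot(v, v)); }

// Returns the factor by which to multiply the source's pitch.
float DopplerPitchRatio(Vec3 sourcePos, Vec3 sourceVel, Vec3 listenerPos,
                        float speedOfSound = 343.0f)
{
    Vec3 toListener = { listenerPos.x - sourcePos.x,
                        listenerPos.y - sourcePos.y,
                        listenerPos.z - sourcePos.z };
    float dist = Length(toListener);
    if (dist < 1e-4f) return 1.0f;

    // Component of the source's velocity toward the listener (m/s),
    // clamped so a supersonic source doesn't blow up the ratio.
    float approachSpeed = Dot(sourceVel, toListener) / dist;
    approachSpeed = std::min(approachSpeed, 0.9f * speedOfSound);

    // Stationary listener, moving source: f' = f * c / (c - v_approach).
    return speedOfSound / (speedOfSound - approachSpeed);
}
```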

All these tools – and also UltraPitch, the C6 multiband compressor, almost every Waves EQ – are really common tools for me that I use as much as I use a fork, knife, and spoon at a meal. They’re totally necessary in my day-to-day work and the crafting of audio for games.

How do you handle the dialogue you’ve got in the game? What are the main challenges there?

A main concern with dialogue is mic proximity. A lot of times, dialogue is recorded before the visuals have been created, or the dialogue director might have a different idea of how the line should be delivered. Once all the visuals are in place, we might find in post that a line of dialogue needs to sound as if it were recorded further away from the mic – as if the character were talking to someone down the street – or vice versa, we might need a line of dialogue to sound more intimate and close to the player.

In these cases we would try to simulate the mic proximity a little better by using several plugins. My go-to’s for this purpose are the MV2 compressor and the L3-LL Multimaximizer. I find the MV2 especially powerful because I can grab my source and, using the Low Level and High Level controls, shape that sense of mic proximity, which makes the sounds I get really versatile.
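As a very rough sketch of the ingredients that sell “further from the mic” when a line cannot be re-recorded – lower level, some high-frequency loss, a bit more room – here is a hypothetical parameter curve. The specific numbers are illustrative assumptions, not Dwight’s actual chain; plugins like MV2 and L3-LL then keep the pushed-back line intelligible.

```cpp
struct ProximitySettings {
    float gainDb;          // overall level drop
    float lowpassCutoffHz; // off-axis / air-absorption high-frequency loss
    float reverbSendDb;    // how much room to blend in
};

// distance01: 0 = whisper-close to the mic, 1 = shouting from down the street.
ProximitySettings SimulateMicDistance(float distance01)
{
    ProximitySettings s;
    s.gainDb          = -12.0f * distance01;              // quieter as it moves away
    s.lowpassCutoffHz = 18000.0f - 12000.0f * distance01; // duller as it moves away
    s.reverbSendDb    = -24.0f + 18.0f * distance01;      // more room as it moves away
    return s;
}
```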

But to be honest, it’s always better to get the performance correct in the recording studio, and we’ll end up bringing an actor back in for pickups at a later date so we can replace those early takes.

How do you make sure that the sound effects and the music score are not competing with one another?

A common thing for us to do is to grab video captures of the game after we have a substantial amount of the sound design implemented and pre-mixed. We give that video capture to the composer and they can work with the instrumentation and the mix to minimize cross-frequency contamination.

Do you find that the audio aspect of games has more weight now than it did 5 or 10 years ago?

Definitely, audio seems to matter more to the Creative Director level of the team now than ever before. Audio is recognized as a tool that can really convey emotion to the player. Sound in cinematics is incredibly powerful and using it correctly in the storytelling aspects of the game is so important. I’m so happy that we’ve come to a point where we can use surround sound and LFE channels to deliver a full spectrum audio experience to the player.

What about the use of VR and immersive 3D audio technologies? How has that affected the weight given to audio?

VR is definitely an exciting frontier for sound designers in our industry. In many cases I’m finding that less is more when designing audio for VR, because oftentimes you don’t want to cloud the player’s focus on what you specifically want them to see in the game.

I have certain tools I use to force certain sounds to be more prominent in VR. Believe it or not, some of the most useful plugins you can use to emphasize and localize specific sounds are EQs, and I lean quite heavily on the Q10, F6 dynamic EQ, and H-EQ plugins when designing sounds that I need to localize to the player.
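As a concrete, if simplified, illustration of using EQ to make a sound pop forward: below are the standard RBJ-cookbook coefficients for a single peaking EQ band, which boosts a narrow region around the frequencies that carry a sound’s identity. This is the generic filter math, not the internals of Q10, F6 or H-EQ.

```cpp
#include <cmath>

struct BiquadCoeffs { float b0, b1, b2, a1, a2; };

// Peaking EQ band (RBJ audio EQ cookbook), coefficients normalized by a0.
BiquadCoeffs PeakingEq(float sampleRate, float centreHz, float q, float gainDb)
{
    const float A     = std::pow(10.0f, gainDb / 40.0f);
    const float w0    = 2.0f * 3.14159265f * centreHz / sampleRate;
    const float alpha = std::sin(w0) / (2.0f * q);
    const float a0    = 1.0f + alpha / A;

    return {
        (1.0f + alpha * A) / a0,     // b0
        (-2.0f * std::cos(w0)) / a0, // b1
        (1.0f - alpha * A) / a0,     // b2
        (-2.0f * std::cos(w0)) / a0, // a1
        (1.0f - alpha / A) / a0      // a2
    };
}
```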

By the way, are you a gamer yourself? Do you play the games you develop, or is it only during production?

I serve on a panel where, every year, a group of industry vets get together to play the games that have come out that year and rate the quality of their sound design. That process keeps me sharp and ‘in the know’ about what other games are doing.

I also have an exercise I started at work that I call C.A.R.S. (Competitor Audio Review Session) where we’ll play as a group and focus on game audio that we respect or have heard great things about. So yes, I am still a gamer and play as much as I can. Even though we put thousands of hours into every game we’re making, there are still a few where I like to open the finished package, plop down on the couch, and play the game for real.

Who inspired you when you were making your first steps?

Donkey Kong! Seriously. Because I started so early in the game industry, there wasn’t a lot to compare against. So really, my biggest inspiration was playing hours upon hours of Pac-Man and Donkey Kong. I’m certain that I gleaned valuable information from them, like how to make an iconic sound.

Finally, what are your most important tips for novice sound designers?

Be flexible. Remember that you’re working with creative assets and that they are very subjective. Work with members of the team from every discipline, so that everyone works cohesively to make an asset that sounds, looks and functions great.

Another tip would be not to work for free unless it’s on a passion project. Otherwise, you’ll find that people will take advantage of you. Your time and creative input are valuable, or they wouldn’t be looking to you for something in the first place.

Lastly, don’t expect to start creating iconic award-winning assets from the very start. You’ll learn important lessons and tricks along the way. File those away in your ‘recipe card index,’ and one day the hard work will pay off.

Thanks, Dwight!

Discover more sound design videos, interviews and tips on the Waves blog.
