So you've written and produced your music, recorded your dialogue, and created all your SFX - but how do you get these files into your game and make them behave the way you want?

In game audio, creating the audio files themselves (SFX, music, etc.) is only part of the process; just as important is building the systems that govern how and when those sounds are played. Instead of placing audio on a timeline, game audio professionals are usually asked to design audio events, which a technical sound designer or programmer on the development team then wires up to Unreal, Unity, or another game engine. Wwise is one of the leading audio middleware applications used to design these events. It integrates with Unreal, Unity, and most other game engines, giving the user the ability to implement all of the audio in a game.

Students will learn the basics of working in Wwise, general considerations for implementing sound effects and a musical score, and the tools available for troubleshooting. During the course, students will implement all of the audio for the open-source game Cube while working through the educational resources created by Audiokinetic, the developer of Wwise. This class is designed for sound designers, composers, developers, and anyone else experienced in sound who wants a more holistic game audio skill set.
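To make the event-driven workflow concrete, the sketch below shows roughly what it looks like when a programmer triggers a Wwise event from engine-side C++ code. This is an illustrative sketch only, not part of the course materials: it assumes the Wwise SDK is already integrated, the sound engine is initialized, and a SoundBank containing a hypothetical "Play_Footstep" event has been loaded. The point is the division of labor: the sound designer defines the event's behavior in the Wwise authoring tool, and the code simply posts it.

    #include <AK/SoundEngine/Common/AkSoundEngine.h>

    // Hypothetical game-object ID for the player character.
    static const AkGameObjectID kPlayerObjectID = 100;

    void RegisterPlayer()
    {
        // Register the game object once so Wwise can track it (position, RTPCs, etc.).
        AK::SoundEngine::RegisterGameObj(kPlayerObjectID, "Player");
    }

    void PlayFootstep()
    {
        // Post the event authored by the sound designer in the Wwise project.
        // Code only triggers the event; what actually plays (which sounds,
        // randomization, attenuation) is defined entirely in Wwise.
        AK::SoundEngine::PostEvent("Play_Footstep", kPlayerObjectID);
    }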