Abstract
Taking inspiration from Little Big Planet 3's generative audio capabilities, I used Unreal Engine 5's MetaSounds to explore the possibilities of procedural music design. Like Little Big Planet, MetaSounds offers a robust node-based development interface, providing a less intimidating introduction for new users and encouraging experimentation during creative systems development. This post documents the process of learning and designing my own procedural music system in MetaSounds, demonstrating the tool's effectiveness as a performative musical instrument.
Overview
MetaSounds is described by Epic as "a high-performance audio system that provides audio designers with complete control over a Digital Signal Processing (DSP) graph for the generation of sound sources" (Unreal Engine 5.3 Documentation). Every MetaSound functions as an independent audio engine, and its structure as a DSP graph, rather than something akin to the more common Digital Audio Workstation, allows audio to be generated procedurally at runtime alongside other sound sources (Unreal Engine 5.3 Documentation). In theory, then, multiple MetaSounds for different parts of an instrument could act simultaneously within the game engine. Another key strength of MetaSounds is its self-referentiality: graphs can be nested directly inside other graphs as variables, opening the door to arbitrarily complex designs (Unreal Engine 5.3 Documentation). The flexibility and complexity of MetaSounds alone make a strong case for it as a compositional tool, beyond that of a rendering engine.
Method
With all these factors in mind, I installed Unreal Engine 5, ready to begin my exploration. Before jumping straight into what would become the final product demonstrated in this post, I wanted to get acclimated to navigating and working within the engine.
Tutorials
The main tutorial I relied on for fundamental knowledge of MetaSounds came from Epic Games' own community forum. Matt Spendlove's "MetaSounds" seems to be one of the earliest resources for the system. He even has a recorded lecture from GameSoundCon 2022 posted on the official Unreal Engine YouTube channel, where he presents much of the information found in the community post.
The post is extremely comprehensive in easing new users into MetaSounds, with sectioned steps introducing new concepts and a GitHub repository that includes the same session files used in the video series. Out of all the tutorial sessions, the two most helpful to the creation of my final product were the courses on subtractive synthesis and building a monosynth.
MetaSound Subtractive Synthesis
Having only worked with additive synthesis when designing sounds in my DAW, I needed an entirely new way of thinking to grasp subtractive techniques. I relied more on my oscilloscope plugin MiniMeters than my ears when refining my sound to match the video's, with far more trial and error than focused direction. Figuring out how to shape the white noise into a more listenable signal was a challenge, and the exercise left me better equipped with filter approaches and a high-level understanding of how to design synths in this manner.
Building a MetaSound Monosynth
Without this tutorial I would have been unable to complete my final project. My familiarity with synthesizers exists largely at the soft-synth level, using additive synthesis by editing set parameters hardcoded into whatever VST I'm using. Needing to isolate the elements and construct the synth engine from scratch was perhaps one of the most engaging and enthralling parts of learning MetaSounds. Translating what I knew of signal processing into the node-based system let me visually track how I was constructing the modules, lending a nearly tactile realness to the concepts introduced. Upon completing these two tutorials, I felt extremely confident and excited about starting the full project, though perhaps naively so.
Opening Fresh
I felt ready and confident: I made a new Unreal project titled "Game_Device_Final" and launched a new MetaSound source. And then I stared at it.
Fresh MetaSounds File
The emptiness of the project struck me. Without realizing it, I had fallen into tutorial syndrome, where the comfort of following guided instruction had given me a false sense of mastery. Instead, here I was, paralyzed by the possibilities the project could take. My original sense of direction was vague in itself: create a generative music loop. But within that broad scope I hadn't considered the intricacies required to get instruments, tempo, percussion, and an overall musical feeling out of a series of node arrays. Nonetheless, I figured the best decision was to start with what I had gained from the tutorials and build the system through iteration.
The First Design
After about an hour of experimentation, I arrived at the earliest stage of my vision for a generative music system. I built a square-wave monosynth and, in an attempt to add a pulsating rhythm to the tone, attached a trigger-based repetition sequence. However, I was still very uncomfortable with nodes not explained in the video tutorial, so my understanding of how to properly take advantage of some blocks was limited. In the video, you can see node designs that aren't functioning, or that are entirely detached out of frustration. Caution when playing this video: the sound was not going through any processing and is very loud.
MetaSound Milestone 1
At this stage I was quite unsatisfied with my results. The progress I thought I had made after finishing the tutorials felt undone, and I was realizing the breadth of the task that lay before me. But I didn't want to sit in the discouragement for too long. It wouldn't have been fair to compare my first design in MetaSounds to the examples that inspired me, nor to give up on the project considering everything was due within the week.
Building Again
So I started over. The CPU usage of each individual MetaSound is low, and it is very common to have many in a single project. After about a thirty-minute break, and while preserving the original attempt, I created a new MetaSound source to begin again. This system grew far more complicated than the previous one, as during the break I had referenced the Unreal documentation to read about the nodes that weren't mentioned in the tutorial series. With that reference, I felt more confident using the programming-based nodes, and was able to create separate structures that could set a tempo and communicate directly with the synth engines. [The audio in videos from this point forward is normalized.]
MetaSound Milestone 2
To explain this design: three core building blocks generate the looping patterns. The top left became my base building block for all future tempo systems. On play, a trigger repeater node sends a signal to a counter based on a set tempo.
Tempo Building Block
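Outside the graph, the logic of this tempo block can be sketched in ordinary code: convert a BPM value into a period, then emit counted triggers at that interval. This is a hypothetical Python analogue for illustration, not how the MetaSounds nodes are actually implemented.

```python
def tempo_triggers(bpm, beats):
    """Yield (beat_index, time_in_seconds) pairs, mimicking a
    trigger repeater feeding a counter at a fixed tempo."""
    period = 60.0 / bpm  # seconds per beat
    for count in range(beats):
        yield count, count * period

# At 120 BPM, triggers land every half second.
print(list(tempo_triggers(120, 4)))
# → [(0, 0.0), (1, 0.5), (2, 1.0), (3, 1.5)]
```

In the graph, the counter's output is what downstream blocks read to decide which step of a pattern to play.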
This system fed into a sine-wave synth that arpeggiated over random scale degrees pulled from an array. The bass/lower tone functioned identically, save for blending triangle and sine waves for its sound and omitting the trigger repeater so the note sustains beneath the arpeggiated figure. By the stage recorded here, I was feeling better about my progress, but still unsatisfied with the result. After a three-hour session of listening to raw sine waves, I decided to pause and take a longer break.
A Second Pass
That time away was the breakthrough I needed. From there, I made a five-hour sprint on another entirely new MetaSound source, taking the modular structure from the tempo block and using it as the template for the main elements of each system.
Arpeggiator System Block
Further reading of the Unreal documentation gave me more ideas for organizing the project, and after plenty of trial-and-error tweaking I arrived at the pictured design. I was able to generate a better sequence of harmonic tones by utilizing the "Scale to Note Array" node. Selecting the "Minor Pentatonic" scale as a base, the tones arpeggiated randomly in time once I attached the Melody Generation block to the Tempo block. Rather than rely solely on one synth engine, I fed the scale system into both a sine and a saw wave to give more potential variation during playback. The two sound sources can be crossfaded via an LFO, but I decided to assign the parameter to a knob instead. This gives freedom and agency during performance, allowing active crossfading while showcasing the full MetaSound.
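In code terms, the melody block amounts to picking a random degree from a scale array on each beat. A minimal sketch, assuming the "Minor Pentatonic" selection corresponds to the usual semitone offsets (that mapping is my assumption, not taken from the node's documentation):

```python
import random

# Semitone offsets of the minor pentatonic scale above the root
# (my assumption of what "Scale to Note Array" yields here).
MINOR_PENTATONIC = [0, 3, 5, 7, 10]

def arpeggiate(root_midi, steps, rng):
    """Pick one random scale degree per beat, as the melody block does."""
    return [root_midi + rng.choice(MINOR_PENTATONIC) for _ in range(steps)]

notes = arpeggiate(57, 8, random.Random(0))  # A minor pentatonic around A3
print(notes)
```

Each resulting MIDI-style note number would then drive the sine and saw oscillators in the graph.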
Bass System Block
Next was the bass tone system. Being entirely honest, I copied what I had made for the arpeggiator and adjusted it to better generate the longer, held tones I was envisioning. Major changes included basing the synth engines on saw and triangle waves, adding distortion through a bitcrusher node, and reducing the BPM repeater frequency so each note sustains for a whole note. The note-generation system was still randomized over the minor pentatonic scale. Upon routing both systems to the output and hearing the playback, I felt satisfied with my progress: a nine-hour sprint had produced tangible progress toward my initial vision. I was more than eager to return to the design the following day.
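The bitcrusher distortion mentioned above can be illustrated in a few lines: quantizing sample amplitudes down to a small number of levels. This is a rough sketch of the idea, not the Unreal node's actual algorithm:

```python
def bitcrush(samples, bits=3):
    """Crude bitcrusher: quantize each sample (range -1..1) to
    2**bits amplitude steps, introducing harmonic distortion."""
    levels = 2 ** bits
    return [round(s * levels) / levels for s in samples]

print(bitcrush([0.0, 0.1, 0.5, -0.7]))
# → [0.0, 0.125, 0.5, -0.75]
```

Fewer bits means coarser steps and a harsher, grittier tone, which suited the sustained bass.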
MetaSound Milestone 3
Day 1 Finished MetaSound
Seeing it Through
The next phase of design was constructing a system to generate chords alongside the bass. This proved the most difficult aspect of the design overall, as finding a structure that avoided repeated clashes with the generated bass tones required refactoring the bass system as well. As for the synth design, I combined four saw-wave generators, with three of them detuned to scale degrees that together form a 7th chord.
Chord Synth Engine
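Numerically, detuning oscillators to chord degrees just means scaling the root frequency by equal-temperament ratios. Here is a sketch assuming a minor 7th voicing (root, minor 3rd, 5th, minor 7th); the exact intervals used in my graph depended on the generated degree:

```python
# Chord tones as semitone offsets above the root: a minor 7th chord,
# chosen here as an illustrative voicing (an assumption on my part).
SEVENTH_OFFSETS = [0, 3, 7, 10]

def chord_freqs(root_hz):
    """Frequencies for the four saw oscillators, the upper three
    detuned from the root via equal temperament: 2**(semitones/12)."""
    return [root_hz * 2 ** (s / 12) for s in SEVENTH_OFFSETS]

print([round(f, 1) for f in chord_freqs(220.0)])  # root = A3
# → [220.0, 261.6, 329.6, 392.0]
```

In the graph, the three detuned saws are mixed with the root saw so every trigger produces a full chord from a single note input.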
This part of the design drew on my music theory knowledge, as there were no note names (C, D, E, etc.) I could reference. Instead, drawing on what I had learned of pitch classes in serialism, I found I could assign values to the notes: C=0, C#=1, D=2, and so on. I built an array for the scale-degree generator to reference, and adjusted the weights so I and V chords would be more common, providing a better sense of resolution to any random progressions the system generated. I found it necessary to create separate arrays for the chords and bass, with varying weights, to reduce the amount of harmonic clashes, though some clashes are statistically inevitable.
Chord Notes Chord Weights
Bass Notes Bass Weights
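The weighted selection behind these arrays can be mimicked with a weighted random choice over pitch classes. The specific roots and weights below are illustrative placeholders, not the exact values from my arrays:

```python
import random

# Pitch classes as described above: C=0, C#=1, D=2, ...
# Illustrative values only: the tonic (0) and dominant (7) are
# weighted heaviest so progressions resolve more often.
CHORD_ROOTS   = [0, 2, 3, 5, 7, 8, 10]   # seven candidate roots
CHORD_WEIGHTS = [5, 1, 2, 2, 4, 1, 1]

def next_chord_root(rng):
    """One weighted random pick, like the weighted array node per trigger."""
    return rng.choices(CHORD_ROOTS, weights=CHORD_WEIGHTS, k=1)[0]

rng = random.Random(0)
progression = [next_chord_root(rng) for _ in range(8)]
print(progression)
```

A separate (roots, weights) pair for the bass, as in the screenshots above, lets the two voices favor different pitch classes and reduces how often they collide.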
Arriving at these values in each array took lots of listening and tweaking, as some sonorities worked occasionally in the bass but had to be avoided entirely in the overlying harmony. Note the discrepancy: seven potential harmonic roots for the chords versus eight for the bass. Once I was comfortable with the progressions being generated, I moved on to the final stage of production: percussion. For the bulk of this system, I followed Dan Reynolds's example video on designing a kick drum, tweaking the base structure to implement a sidechaining system and increasing the trigger rate to elevate the overall energy of the music. At this stage, I felt I could turn in the project as is, but knew I could add an extra element to push my abilities.
MetaSound Milestone 4
Results
Below is my final, completed MetaSound. The last elements I wanted to introduce were a snare drum and hi-hat to fill out the composition. The hi-hat system grew out of the subtractive synthesis graph I had previously crafted, with an added repeater to generate the rhythm. To reduce listening fatigue, I introduced a panner automated by an LFO, adding variety to the repeated figure. The snare drum is a Frankenstein of the hi-hat's subtractive synthesis engine combined with the transient-shaped kick drum, resulting in an atonal splash every third beat. Finally, I reduced the frequency of kick drum hits to add a little more clarity to the composition as a whole. The resulting system successfully generates an ever-changing musical piece, with continuous variation in its progression.
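The LFO-driven panner on the hi-hat can be sketched as an equal-power pan law modulated by a slow sine wave. A hypothetical Python version, where the LFO rate and the equal-power curve are my assumptions rather than the node's documented behavior:

```python
import math

def lfo_pan(t, rate_hz=0.25):
    """Equal-power stereo pan driven by a sine LFO: returns
    (left_gain, right_gain) at time t seconds."""
    pos = 0.5 + 0.5 * math.sin(2 * math.pi * rate_hz * t)  # 0..1 pan position
    angle = pos * math.pi / 2
    # cos/sin pair keeps total power constant across the stereo field.
    return math.cos(angle), math.sin(angle)

left, right = lfo_pan(0.0)  # centered at t=0: equal gains
print(round(left, 3), round(right, 3))
```

Because the curve is equal-power rather than linear, the hi-hat's perceived loudness stays steady as it drifts between channels.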
Final MetaSound Demo
Full MetaSound Project
Reflections
I am extremely satisfied with my completed MetaSound. For my first foray into the software, I was able to construct a fairly complex design that does what I envisioned in generating a procedural composition. With more time to continue development, I would experiment further with automation and programming in the larger Blueprints system. Assigning specific elements to trigger on entering or exiting areas of a level, or based on velocity parameters of the player character, could showcase the performability of MetaSounds as an instrument beyond the modularity of the source itself. Following the conclusion of this project, I intend to continue experimenting with MetaSounds. I have found a new medium for designing audio processing systems, and I look forward to mastering Unreal Engine's technological innovations on my personal journey to create immersive and dynamic soundscapes in the realm of game audio.
Resources
Epic Games. "MetaSounds: The Next Generation Sound Sources in Unreal Engine." Unreal Engine 5.3 Documentation, 6 Sept. 2023, docs.unrealengine.com/5.3/en-US/metasounds-the-next-generation-sound-sources-in-unreal-engine/.
Lantz, Anna. "Music Nodes in MetaSounds: Tutorial." Epic Developer Community, 5 May 2022, dev.epicgames.com/community/learning/tutorials/qzE5/unreal-engine-music-nodes-in-metasounds.
Reynolds, Dan. "UE5 Early Access MetaSounds: Making a Synth Kick Drum." YouTube, 27 May 2021, youtu.be/vcTqZPDR0Yk?si=RjTmZxYVo9qcZvWd.
Spendlove, Matt. "MetaSounds: Recommended Community Tutorial." Epic Developer Community, 26 Jan. 2023, dev.epicgames.com/community/learning/recommended-community-tutorial/Kw7l/unreal-engine-metasounds.