Quest of Honor

Project Overview

The goal of Quest of Honor was to create a tabletop game blending strategy, action, and role-playing. I focused on developing an engaging gameplay experience through concise rules, intuitive visual design, and clear player interactions. The aim was to immerse players in a medieval fantasy world where duels and tactical decisions drive the narrative.

Project Owner

Myself

Role

Game Designer & Content Developer

Timeline

October 2017 – Present

Development Platform

Tabletop

Skills Used

Game Design, UX/UI for Game Components, Instructional Content Creation, Graphic Design & Layout, Copywriting & Editing

Tools

Adobe Photoshop, Adobe Illustrator, Adobe InDesign, Google Sheets, Google Docs, Trello, Notion, Unity, Autodesk Maya.

Challenges

1. Complex Mechanics, Simple Rules

The core challenge was making complex gameplay mechanics easy to learn and fun to play, ensuring that even new players could quickly grasp the rules without sacrificing depth for experienced players.

2. Creating Intuitive Visuals

Designing reference cards that quickly communicated essential information while reducing visual clutter was a key priority. Balancing aesthetics and functionality was crucial to enhance gameplay without overwhelming players.

3. Maintaining Thematic Consistency

All design elements (cards, tokens, icons) had to fit within the medieval fantasy theme while staying cohesive across game components.

Project Flight:

A Case Study of Terrain Generation, Audio Analysis, and Multithreading

A project built out of love, frustration, and the desire to create something that f*cking works.

1. Project Overview


What started as just the first two inevitably ended up including multithreading, because it proved impossible to implement these concepts smoothly without it.
So, why Terrain Generation, Audio Analysis, and Multithreading?
The game in development that showcases these three concepts is deceptively simple:

Input a song or audio file of your choice, allow the program to process the audio, and from your audio file the program creates a map to fly a paper plane through, a truly immersive and customizable experience.

Using keyboard controls or your phone's gyroscope to tilt, you fly through a procedurally generated cave or tunnel created from your music, watching it unfold in real time as your song plays.

Some of the initial questions after starting this project were:

  • What would be an engaging and consistent way to ensure that whatever audio file is put in, would output something interesting in the terrain generation?
  • How could I ensure that the player experienced the terrain at the same moment they heard the music that shaped it?
  • How can I show a vast difference between song types (rock vs smooth jazz) in terrain generation while remaining engaging to play through?

2. Exploration & Discovery

In the original design, the terrain was not a tunnel or a cave, but an endless landscape. At the time, I had been working on an ‘endless runner’ style of game that would involve flying through different styles of terrain. The idea was heavily inspired by the flying in the “How to Train Your Dragon” series. While the project was still in the prototyping phase, I realized the core premise of the game idea was at odds with my perspective on game design. With the way I envisioned some of the styles of terrain, I would have needed to implement some form of invisible wall to keep players from simply flying above the map and avoiding all the obstacles. Not only could they avoid the obstacles, they could fly beyond the edge of what I had designed. This is what first led me toward procedurally generated terrain.

In an attempt to avoid this issue, I began testing real-time procedural generation by following along with a coding series by Sebastian Lague. I quickly developed a prototype of the level which, while working perfectly fine on PC, was consistently jittery or frozen when tested on mobile. As I continued to follow along with the series, I saw Sebastian was encountering a similar problem and was looking to implement threading to improve performance. I did some research of my own and, after seeing the difficulty he was having, felt that this was outside my current skill level.

In an effort to not give up, I looked into compute shaders, as I had seen Sebastian use them in a previous terrain series to improve performance. While I feel this may have worked for this project, after exploring the idea I felt my effort was better spent on improving my knowledge of C# first. After running into so many obstacles, I decided to put the project on the back burner for a bit while I determined where I wanted to go next.

It was around this time that I attended the Boston Festival of Indie Games, which featured the r.e.B.E.R.t.h. prototype by ‘SonicBloom’, a horizontally scrolling shooter in which enemies attack according to the beat of the music. ‘SonicBloom’ had created a tool called ‘Koreographer’, which they were showcasing at BostonFIG. ‘Koreographer’ lets you synchronize your gameplay to music by creating events along an audio waveform, displayed in their custom editor tools. Seeing their booth reminded me of a long-forgotten desire to make a rhythm game.

I started trying to find unique rhythm games that might give me further inspiration. A modern take on the genre that has become very successful is ‘Beat Saber’, which combines a VR ‘Fruit Ninja’ with beat matching. Around this time I met the indie developer behind the game ‘Thumper’ at an IGDA meeting. ‘Thumper’ is a “rhythm violence” game that I see as an evolution of ‘Audiosurf’, but with controls similar to an endless runner. One of its key differences is that you don’t play along to the music; you create the beat as you try to match the rhythm of the level. Another unique game I found during this research was ‘Crypt of the Necrodancer’, a rhythm-based dungeon crawler where you move and attack to the beat. I liked that all these games brought a fresh perspective to the rhythm genre.

Importantly, I found no games that generated terrain from music input; most that came close were essentially ‘Audiosurf’. And even then, none of the beat-matching games I found were flight-based. The closest I have been able to find is Aaero, which, given its controls, is more like sliding along the inside of a cylinder than flying.

That’s when it hit me. I could use the assets from my endless runner to develop a new and unique way to experience your favorite music: a beat-matching style game where, instead of hitting beats, your aim is to avoid obstacles generated from each song’s unique waveform. Once I had the idea to make terrain out of the sound waves, it became a matter of figuring out how to accomplish it. I realized I could do this by modifying the terrain generator to create a tunnel instead of an exterior landscape, which should also be more mobile-friendly than my previous idea.

3. Game Design Process

I started my development by going back to the spark that reignited this project: ‘Koreographer’. It was not long before I realized that, due to the nature of the plugin, it would not allow for the type of audio synchronization I was looking for. I knew I wanted a way to truly analyze an audio file. I figured that if I could recreate a typical audio visualizer, I could work out where to go from there.

I began by asking, “How do I analyze audio frequencies?” Research began on audio spectrum analysis, audio visualizers, and anything audio-related I could imagine. After expanding my understanding of the way audio frequencies work, the different wavelengths, and how audio files are stored, I started searching for resources to bring these concepts into Unity. This is when I found a tutorial on Algorithmic Beat Mapping in Unity that included a simple audio visualizer. The repository contains a script that analyzes the audio samples, outputs a spectral flux value for each sample, and then compares samples within a range to determine whether or not a given sample is a peak. The visualizer then displays the samples on screen as the song plays, like a live audio visualizer. The repository only makes the totality of the song’s frequencies easily accessible, so I then split the audio into 8 separate frequency ranges.
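As a rough sketch of how that analysis works (the class names and thresholds here are illustrative assumptions, not the tutorial's or the project's exact code): compute the spectral flux between successive FFT frames, then flag a sample as a peak when its flux stands out against the local average.

```csharp
using System.Collections.Generic;

// Illustrative spectral-flux onset detection, loosely following the
// approach described above.
public class SpectralFluxSample
{
    public float Time;
    public float SpectralFlux;
    public bool IsPeak;
}

public class SpectralFluxAnalyzer
{
    const int WindowSize = 30;         // neighbors compared on each side
    const float PeakMultiplier = 1.5f; // how far above the local average a peak must sit

    float[] previousSpectrum;
    readonly List<SpectralFluxSample> samples = new List<SpectralFluxSample>();

    // Feed in one FFT frame per call (in Unity, e.g. via AudioSource.GetSpectrumData).
    public void AnalyzeSpectrum(float[] spectrum, float time)
    {
        // Spectral flux: the sum of positive bin-to-bin increases between frames.
        float flux = 0f;
        if (previousSpectrum != null)
        {
            for (int i = 0; i < spectrum.Length; i++)
            {
                float diff = spectrum[i] - previousSpectrum[i];
                if (diff > 0f) flux += diff;
            }
        }
        previousSpectrum = (float[])spectrum.Clone();
        samples.Add(new SpectralFluxSample { Time = time, SpectralFlux = flux });

        // A sample is a peak when its flux exceeds the scaled average of its neighbors.
        int center = samples.Count - 1 - WindowSize;
        if (center >= WindowSize)
        {
            float sum = 0f;
            for (int i = center - WindowSize; i <= center + WindowSize; i++)
                sum += samples[i].SpectralFlux;
            float threshold = (sum / (WindowSize * 2 + 1)) * PeakMultiplier;
            samples[center].IsPeak = samples[center].SpectralFlux >= threshold;
        }
    }
}
```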

More research was required before I could convert these values into terrain. First, I needed to create the coordinate plane that would allow the terrain to wrap 360 degrees and connect back to itself. This was not too difficult to understand, but it took some time to work out. Next, I needed to verify that the terrain created actually reflected the audio it was based on. For this to hold up, it is crucial that the player fly down the tunnel at the exact same rate the song is playing; this is essential for maintaining immersion. When the music is off by any amount, it feels equivalent to your game lagging.
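The wrapping step can be sketched like this, assuming Unity's mesh conventions (names are illustrative, not the project's code): each ring of vertices is laid out around a full circle so the strip connects back to itself, and an audio-derived height pushes each vertex outward from the tunnel's center.

```csharp
using UnityEngine;

// Illustrative tunnel-ring generation: one ring of vertices per step down
// the tunnel, wrapped a full 360 degrees so the mesh closes on itself.
public static class TunnelRing
{
    public static Vector3[] BuildRing(int vertsPerRing, float baseRadius,
                                      float z, float[] heights)
    {
        var ring = new Vector3[vertsPerRing];
        for (int i = 0; i < vertsPerRing; i++)
        {
            // The angle sweeps the full circle; the triangle indices that
            // reference vertex 0 again are what close the loop.
            float angle = (i / (float)vertsPerRing) * Mathf.PI * 2f;

            // Audio data displaces each vertex outward from the base radius.
            float radius = baseRadius + heights[i % heights.Length];

            ring[i] = new Vector3(Mathf.Cos(angle) * radius,
                                  Mathf.Sin(angle) * radius,
                                  z);
        }
        return ring;
    }
}
```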

After getting a rough prototype working, I transferred my build to my phone and rushed off to the gym to show my friend the progress I was making. His band had just recorded a new song, and I had turned it into a level. My enthusiasm faded as what took moments on my PC took agonizing minutes on mobile. I had dealt with long load times in games before, but waiting 7-10 minutes to load a 4-minute song is something I felt few people would be willing to tolerate. I needed to find a way to improve the load time, or the project was dead.

I had a somewhat older phone, but it still performed just fine with everything else, so I thought it could be a device issue. I borrowed my friends’ top-of-the-line phones, and the load times were only fractionally better. If it was not the device, I figured it had to be something in my app slowing things down. As I looked through the file importer, I noticed a section about LoadInBackground, and researching what it did led me back to the notion of threading. By default, the Unity engine is essentially single-threaded, and most games are set up for single-threaded processing. But if modern phones have multiple cores, maybe I could leverage multithreading to split the file up and load it across those cores. Unfortunately, not everyone has a top-of-the-line phone, so I immediately pulled up the specs of phone models from the last 5 years to see how many cores they had. Seeing that even the older phones I had lying around had quad cores, I felt confident I wouldn’t be developing something only a fraction of people could use.

There was only one problem: I still knew nothing about multithreading beyond how it works conceptually.

I decided to tackle this the same way I had each time I’d been thrown into the deep end before: read, read, and read some more, until it finally starts to make sense. Further development of the terrain generation and audio sequencing was paused while I unpacked multithreading as a concept and worked toward implementing the use of multiple cores. After spending about three weeks reading a book on multithreading and anything else I could get my hands on about the subject, I felt confident enough to give it a shot.

Before I could implement multithreading, I needed to track down where the delays were coming from. For the performance tests I first tried the Unity profiler, but I didn’t feel it gave me cohesive enough information to pinpoint the slowdown, and I couldn’t run the profiler on mobile. So I built a dedicated scene for performance testing. In the test scene, I broke the loading and audio analysis into sections so I could see how long each method took: I would start a timer at the beginning of each function and stop it at the end. By doing this, I learned which sections were taking the longest and could then refine them. I traced the delays to the loading step and the final processing of the analysis.
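The timing harness amounted to something like the sketch below, using System.Diagnostics.Stopwatch; the stage names are placeholders for the actual loading and analysis methods, not the project's real code.

```csharp
using System.Diagnostics;

// Wraps a stage of work in a stopwatch and logs how long it took.
public static class PerfTimer
{
    public static long TimeStage(string label, System.Action stage)
    {
        var sw = Stopwatch.StartNew(); // timer starts at the beginning of the function
        stage();
        sw.Stop();                     // ...and ends when the function returns
        UnityEngine.Debug.Log($"{label}: {sw.ElapsedMilliseconds} ms");
        return sw.ElapsedMilliseconds;
    }
}

// Usage (hypothetical stage methods):
// PerfTimer.TimeStage("Load audio", LoadAudioFile);
// PerfTimer.TimeStage("Analyze samples", RunAnalysis);
```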

That led me to create a producer-consumer queue, which lets me split the song up and process it in chunks across different cores. Implementing the producer-consumer queue reduced the loading time on mobile from 7-10 minutes down to 1.5-3 minutes. While this load time may still seem quite long, it is only needed once per song, so that the analysis can be done; the terrain data is then stored. On each subsequent play, the terrain data is much smaller and easier to load, and the song can be loaded alongside the level as it is being played.

4. Final Design

Early on, I felt I would have to figure out multithreading to some degree, though it took a while before I resigned myself to the task. The references I had been learning procedural generation from mentioned compute shaders and multithreading, and it seemed necessary to explore them. To have things work smoothly and load in a reasonable timeframe, I knew I would have to take on some challenging learning endeavors.

I had seen compute shaders used in similar terrain assets before, and I tried to implement one. I abandoned that endeavor after determining there was a large knowledge gap to close and I had more pressing problems to solve. In the future, I would like to revisit compute shaders, either to improve the terrain generation or to speed up the Fast Fourier Transform, which is currently the slowest part of the analysis.

In the end, multithreading seemed like the better choice, as it solved the largest problems at hand as well as ones I might encounter in the future. The producer-consumer queue I created eliminated most of the long loading delays. I could divide the song up based on the number of cores each device has, splitting the total bytes of the song into that many chunks. I would then process the samples by creating a queue for each core and passing each queue its data along with a marker of which part of the song the chunk came from. This allowed me to stitch the data back together in the correct order.
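As a minimal sketch of that pattern (illustrative names, not the project's actual code): split the samples into one chunk per core, tag each chunk with its index, process the chunks in parallel, and reassemble the results by index.

```csharp
using System.Collections.Concurrent;
using System.Threading.Tasks;

public static class ChunkedProcessor
{
    // Splits 'samples' into one chunk per core, runs 'analyze' on each chunk
    // in parallel, and returns the results in their original order.
    public static float[][] ProcessInChunks(float[] samples,
        System.Func<float[], float[]> analyze)
    {
        int cores = System.Environment.ProcessorCount;
        int chunkSize = samples.Length / cores;
        var results = new ConcurrentDictionary<int, float[]>();

        Parallel.For(0, cores, index =>
        {
            // The last chunk absorbs any remainder from the integer division.
            int start = index * chunkSize;
            int length = (index == cores - 1) ? samples.Length - start : chunkSize;
            var chunk = new float[length];
            System.Array.Copy(samples, start, chunk, 0, length);

            // The index acts as the marker: it records which part of the
            // song this chunk came from, whatever order the cores finish in.
            results[index] = analyze(chunk);
        });

        // Stitch the data back together in the correct order.
        var ordered = new float[cores][];
        for (int i = 0; i < cores; i++) ordered[i] = results[i];
        return ordered;
    }
}
```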

The same producer-consumer queue was also used to cut down analysis time and to let terrain generation run seamlessly alongside the level as it plays, making build time unnoticeable.

Recently, the challenge I have been working to overcome is refining the shape of the tunnel, specifically the UVs in relation to shadows. It has become clear that the terrain is created properly, but the way the engine thinks the terrain is placed has been leaving areas of the mesh with improper shadows.

At this point Project Flight remains far from perfect or finished. I can recognize that, and still be proud of the exploratory steps that brought me here.

5. Learnings

Principally, the key takeaways were how to multithread, why it’s so important, and what a significant impact multithreading can make. It has also become clearer how complex multithreading actually is, how difficult it is to implement properly, and that this project barely scratched the surface of what it is capable of. Next time I would not have to reverse engineer things, and I would look for ways to design around multithreading from the start.

The way Project Flight functions, there’s not a lot of possibility for race conditions (two things trying to access the same piece of information at the same time). For many other projects, you would almost certainly want to create locks on certain variables or methods to prevent them. This is because everything Project Flight uses multithreading for happens after the player has interacted: since the player no longer has control and cannot create multiple conflicting events while the system is processing, I can easily segment each request and recompile the results later.

Normally, if you’re looking to change data in a list but another interaction changes it first, how do you guarantee which change happens in the correct order? How do you keep those operations from competing for the same information? In Project Flight, the data always comes back in the correct order regardless, without affecting the output of the systems created. This all comes down to the producer-consumer queue. Planning ahead in this case allowed me to avoid creating a more complex system when a simpler one worked just as well; I find it is usually best to defer to Occam’s Razor. If I were to just split the data and throw it into a queue, I couldn’t ensure it would come back in the correct order, since once the data is split, there’s no way to know which queue will finish first. Without ordering built in, you would need more locks and systems in place to handle data coming back in the wrong order.
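To make that ordering point concrete, here is a small illustrative contrast (a sketch, not the project's code): appending results to a shared list requires a lock and still leaves them in finish order, while giving each worker its own indexed slot needs neither a lock nor any reordering.

```csharp
using System.Collections.Generic;
using System.Threading.Tasks;

public static class OrderingDemo
{
    // Shared list: needs a lock, and results land in whatever order
    // the workers happen to finish.
    public static List<int> Unordered(int chunks)
    {
        var results = new List<int>();
        var gate = new object();
        Parallel.For(0, chunks, i =>
        {
            lock (gate)          // protects the shared list...
                results.Add(i);  // ...but arrival order is still unpredictable
        });
        return results;
    }

    // One slot per chunk: each worker writes only its own index,
    // so no lock is needed and the order is guaranteed.
    public static int[] Ordered(int chunks)
    {
        var results = new int[chunks];
        Parallel.For(0, chunks, i => results[i] = i * i); // some per-chunk work
        return results;
    }
}
```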

In the case of Project Flight, I was able to split the audio into its separate frequency ranges and process each range on its own. For the analysis, I never actually split the song up; instead, I stripped out the individual ranges and processed them on separate cores.

But that was just for processing.

For loading the song, the duration of the song is split up by the number of cores available. The starting point of each node is the length of the audio file divided by the number of cores, times the node ID.
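Expressed as code, that start-offset rule is simple. The sketch below is illustrative (the names are mine, not the project's):

```csharp
// Start offsets for loading: each node begins at
// (total length / number of cores) * node ID.
public static class LoadOffsets
{
    public static int[] StartPoints(int totalSamples, int cores)
    {
        int perNode = totalSamples / cores;    // length divided by core count...
        var starts = new int[cores];
        for (int nodeId = 0; nodeId < cores; nodeId++)
            starts[nodeId] = perNode * nodeId; // ...times the node ID
        return starts;
    }
}
```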

About Me

I am a game designer/developer, 3D modeler, carpenter, and web developer, most passionate about game creation and improving the gameplay experience.

Frustration with existing games is what inspired me to become a game developer. Playing games where signifiers (like a net) sat in a scene without colliders led to many untimely deaths. That frustration, when you expect one reasonable outcome and instead lose progress because someone didn't add a collider, made me want to fix it and develop better gameplay experiences for myself and others.

Beat Vortex

Beat Vortex is the working title for a rhythm game I began working on in 2018.
I picked the project back up in 2020 to see how much further along I could get.

After extensive research, I was able to implement multithreading for the song loading and analysis.
This took load times on mobile from 7-12 minutes, depending on the song, down to 50-90 seconds.

I later used a similar technique to improve the terrain generation, which yielded similar results.

Tools used: Autodesk Maya, Photoshop, Unity, Visual Studio, Trello. 

Honor Lore and Art

During the pandemic, I took the opportunity to freelance and work from home as a way to focus on finishing up one of my passion projects.

One of the first steps along this journey was to develop more of the lore and background of the world. 

Honor Rulebook

As progress on the game began to pick up steam, I shifted focus over to refining the rulebook I had previously thrown together during the original development. 

As part of the process, visual and grammatical improvements were made to improve the flow and clarity of the rules. 

After completing the revisions of the rulebook, the project moved forward into play tests and further refinement. 

Journal App

This past year I started working on a productivity app. I felt that the to-do lists and habit trackers I had tried were very narrowly focused on doing one thing well. This often left users needing multiple apps to handle simple planning and task management for their day-to-day, which creates plenty of opportunities for confusion, wasted time, and more pain points than the software solves.

The goal of the Journal App is to give everyday users the same depth and control over planning their schedules that corporations get from software like Jira, Trello, Monday, or Asana.

This project has been a great test of my UI design and implementation. It has also given me a chance to further improve my workflow for loading/storing user data. 

Tools used:  Photoshop, Trello, Unity, Visual Studio. 

Tulipanov Virtual Gallery

For this project, I was asked by the painter, Igor Tulipanov, to create a virtual gallery space where he could host private exhibitions of his art. 

I used a concept image he gave me to model the exterior from. I then designed an interior that worked well for that space and fit his unique art style.

The final goal for the project is to be able to embed the gallery on the homepage of his site. The main challenge of this project proved to be setting up the ‘multiplayer’ aspect, due to the build being WebGL.

 

Tools used: Autodesk Maya, Photoshop, Unity, Photon 

Pip42-1: Building Design

PIP42 – 1 gives players the ability to place down structures and create bases on a foreign planet. Doing this required modular buildings.

I started off by building a mood board of space structures that I felt fit the aesthetic I was looking for. 
Since the game is top-down, I needed to design all the models so that their roofs could be removed.

Tools used: Autodesk Maya, PureRef.

Honor Stats Rebalancing

After the first few play tests, we found the stats for Honor needed to be rebalanced.
Using Google Sheets allowed me to visualize the distributions of stats for classes, weapons, and armor.
I was able to do this using conditional formatting, which I set to shade cells based on their values.

The original stat calculations took a considerable amount of time to work out. 
Developing a new system cut down the time substantially. 

Tools used: Google Sheets

Honor Card Design

Over a few months, I worked to refine the card designs for my tabletop card game. Here are some iterations of the early stages. 

After not seeing much progress towards a suitable design, I decided to create a mood board to help define what I wanted to create. 

I’ve settled on a simplified look with minimal framing, allowing more room to let the art shine. 

After settling on the new design, I began implementing those concepts across the rest of the card types from the game.

Tools used: Photoshop, Illustrator, InDesign, PureRef

Color Group Website Redesign

This was a project for a local business. They wanted their site updated to something more modern and fresh than their old one.
I designed and built the new site remotely, then swapped it in overnight to prevent downtime and keep users from seeing a broken site.

www.ColorGroup.com

Museum of Bops

Museum of Bops was a freelance animation project for the painter Igor Tulipanov.  

Igor wished to see the characters from his paintings come to life. 

I modeled the characters, built out a museum, and animated all of them.

The video is the intro of the project, where the characters enter the museum before climbing into the paintings to interact with their fellow bops.

Tools used: Autodesk Maya, Photoshop, Unity, Visual Studio, Trello.