Project Flight:
A Case Study of Terrain Generation, Audio Analysis, and Multithreading
built out of love, frustration, and the desire to create something that f*ing works.
Project Overview
Project Flight is a solo project that combines terrain generation, audio analysis, and multithreading. The game allows users to input any audio file, analyze it, and fly a paper plane through procedurally generated tunnels built from that analysis. This results in a unique, real-time gameplay experience tailored to the rhythm and structure of each song.
Project Owner
Myself
Role
Designer, Developer
Timeline
August 2019 – Present
Development Platform
Unity, Mobile
Skills Used
Game Design, UX/UI for Game Components, Instructional Content Creation, Graphic Design & Layout, Copywriting & Editing
Tools
Adobe Photoshop, Adobe Illustrator, Adobe InDesign, Google Sheets, Google Docs, Trello, Notion, Unity, Autodesk Maya
Challenges
1. Creating Consistent, Engaging Terrain Generation
Ensuring that every audio input, regardless of genre or tempo, produced interesting and playable terrain, while maintaining a balance between aesthetic appeal and functional gameplay.
2. Synchronizing Gameplay with Music
Designing the game to provide a seamless experience where the generated terrain changes in real-time as the player listens to the music, keeping the gameplay synchronized with the audio experience.
3. Differentiating Terrain by Music Type
Making sure that a variety of genres (e.g., rock, jazz) had distinct visual and gameplay impacts while remaining equally engaging and playable.
Research
During development, the project drew inspiration from rhythm games like Beat Saber and Crypt of the NecroDancer, as well as innovative prototypes encountered at the Boston Festival of Indie Games. Research involved exploring terrain generation methods and audio synchronization techniques, including tutorials on algorithmic beat mapping. A significant focus was placed on improving knowledge of C# and multithreading, with an iterative approach to testing and refining performance.
Challenges in terrain stability and rendering led to experimenting with compute shaders and profiling tools to understand bottlenecks. Ultimately, lessons learned from other games informed the decision to generate terrain as a tunnel structure to keep players engaged while maintaining performance integrity.
Design System
The colors and fonts for Project Flight have not been finalized, as the game’s visuals are still under development. The design will evolve alongside the gameplay to ensure that the aesthetic complements the experience, with a focus on maintaining clarity and immersion. As new elements are added and refined, the visual identity will be tailored to align with the game’s dynamic, music-driven environments.
The game design combines:
- Procedural Terrain Generation: Each terrain layout corresponds directly to audio inputs, with tunnels created based on song waveforms.
- Multithreading Implementation: The final build utilizes a Producer-Consumer Queue to split audio processing across multiple cores, significantly improving load times.
- Audio Frequency Analysis: Visualizers detect key frequency ranges (e.g., bass, midrange, treble) to influence terrain changes in sync with the song.
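To make the waveform-to-tunnel idea above concrete, here is a minimal, illustrative sketch (in Python rather than the project's C#; the function name `tunnel_radii` and the parameter values are hypothetical, not taken from the actual build). It maps a song's amplitude envelope to one tunnel radius per audio window, so louder passages widen the tunnel and quiet passages narrow it:

```python
def tunnel_radii(samples, window=1024, base_radius=5.0, swell=3.0):
    """Average absolute amplitude per window -> one tunnel radius per segment.

    `samples` is normalized audio in [-1.0, 1.0]. Louder windows produce
    wider tunnel rings; quiet windows produce narrower ones.
    """
    radii = []
    for start in range(0, len(samples) - window + 1, window):
        chunk = samples[start:start + window]
        envelope = sum(abs(s) for s in chunk) / window  # mean |amplitude|
        radii.append(base_radius + swell * envelope)
    return radii

# A quiet stretch followed by a loud one: the second ring is wider.
quiet = [0.05] * 1024
loud = [0.9, -0.9] * 512
print(tunnel_radii(quiet + loud))
```

In the real game this kind of per-segment value would drive mesh generation for each tunnel ring; the sketch only shows the audio-to-geometry mapping itself.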
Challenges:
- Complexity of Multithreading: Implementing multithreading to ensure smooth loading and gameplay without delays was a major hurdle, as it required optimizing processes to run across multiple cores.
- Performance Bottlenecks: The original single-threaded design caused long loading times, especially on mobile devices, which risked player engagement.
- Procedural Terrain Generation: Ensuring that the terrain generated from audio files matched the beat and feel of the music while remaining enjoyable to navigate.
- Audio Analysis Integration: Building systems to extract meaningful patterns from audio files and convert them into visually interesting gameplay elements.
Goals:
- Create a smooth and engaging experience that converts player-selected music files into unique terrains.
- Implement multithreading to reduce load times across devices and maintain immersion.
- Develop audio analysis tools capable of detecting frequency ranges and syncing terrain changes to beats in real time.
- Explore Unity tools and shaders to refine terrain visuals and ensure smooth performance across platforms.
Scope of Work:
- Game Design & Development: Built core gameplay mechanics around audio-driven terrain generation and implemented gyroscope and keyboard controls for navigation.
- Multithreading Integration: Developed a multi-core queue system to reduce loading delays from 7–10 minutes to under 2 minutes.
- Performance Testing & Optimization: Conducted detailed profiling and iterative testing to resolve loading bottlenecks and ensure smooth mobile performance.
- Visual Design: Incorporated shaders and audio visualizers to enhance gameplay feedback and improve player immersion.
Game Design Process
The initial goal was to let players tilt and fly through a procedurally generated cave or tunnel created from their music, using keyboard controls or a mobile gyroscope, with the terrain appearing in real time as the song plays.
Visitors can interact with this embedded flight demo to experience the control mechanics first-hand. The demo allows them to fly a paper airplane, simulating how the game responds to user inputs with smooth WASD or mobile gyroscope controls. This gives a glimpse into the responsive gameplay mechanics.
Audio Spectrum Breakdown Using Fourier Transform:
Each song used in Project Flight is split into frequency bands using a Fast Fourier Transform (FFT). The frequencies are divided into distinct ranges—such as bass, midrange, and treble—and mapped to specific gameplay elements. For instance, terrain shifts or visual effects correspond directly to certain frequencies, creating a real-time interaction between the music and the game environment. This analysis enables an immersive experience where both the visuals and terrain feel synced with the audio.
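The band-splitting step described above can be sketched as follows. This is an illustrative Python stand-in, not the Unity/C# implementation (which would typically use the engine's built-in FFT-based spectrum data); the function name `band_energies` and the band cutoffs are assumptions for the example. A naive DFT is used here in place of a real FFT to keep the sketch dependency-free:

```python
import cmath
import math

def band_energies(samples, sample_rate,
                  bands=((20, 250), (250, 4000), (4000, 20000))):
    """Naive DFT, then total spectral magnitude per band (bass, mid, treble).

    A real implementation would use an FFT; the band edges in Hz are
    converted to DFT bin indices via k = freq * n / sample_rate.
    """
    n = len(samples)
    spectrum = []
    for k in range(n // 2):  # positive frequencies only
        s = sum(samples[t] * cmath.exp(-2j * cmath.pi * k * t / n)
                for t in range(n))
        spectrum.append(abs(s))
    energies = []
    for lo, hi in bands:
        k_lo = int(lo * n / sample_rate)
        k_hi = int(hi * n / sample_rate)
        energies.append(sum(spectrum[k_lo:k_hi]))
    return energies

# A pure 100 Hz tone should land overwhelmingly in the bass band.
rate = 8000
tone = [math.sin(2 * math.pi * 100 * i / rate) for i in range(256)]
bass, mid, treble = band_energies(tone, rate)
```

Each per-band energy would then drive a gameplay element, e.g. bass driving terrain shifts and treble driving lighter visual effects, as described above.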
Performance Optimization with Multithreading (Single vs. Multi-threaded Tests):
To ensure smooth gameplay and faster load times, the game uses multithreading. A Producer-Consumer Queue was implemented to distribute audio processing across multiple cores. The embedded comparison test showcases the performance improvement—shifting from single-threaded processing, which resulted in slower load times, to a multi-threaded approach that significantly reduced loading delays. This optimization ensures that the game remains responsive, even when handling complex audio-visual interactions.
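The Producer-Consumer pattern described above can be sketched in a few lines. This is a hedged illustration in Python rather than the project's C#, with hypothetical names (`analyze`, `producer`, `consumer`) and a placeholder per-chunk computation standing in for the real audio analysis: one producer thread chops the audio into chunks, and several consumer threads drain a bounded queue in parallel.

```python
import queue
import threading

SENTINEL = None  # tells each consumer to shut down

def producer(samples, chunk_size, q, n_consumers):
    """Split the audio into chunks and feed them to the queue."""
    for start in range(0, len(samples), chunk_size):
        q.put(samples[start:start + chunk_size])
    for _ in range(n_consumers):
        q.put(SENTINEL)  # one sentinel per consumer

def consumer(q, results, lock):
    """Pull chunks off the queue until a sentinel arrives."""
    while True:
        chunk = q.get()
        if chunk is SENTINEL:
            break
        peak = max(abs(s) for s in chunk)  # stand-in for real analysis
        with lock:
            results.append(peak)

def analyze(samples, chunk_size=1024, n_consumers=4):
    q = queue.Queue(maxsize=8)  # bounded: producer blocks if consumers lag
    results, lock = [], threading.Lock()
    workers = [threading.Thread(target=consumer, args=(q, results, lock))
               for _ in range(n_consumers)]
    for w in workers:
        w.start()
    producer(samples, chunk_size, q, n_consumers)
    for w in workers:
        w.join()
    return results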
Conclusion:
While Project Flight is still a work in progress, it demonstrates significant strides in merging audio analysis, procedural generation, and multithreading within a game environment. The experience offers a glimpse into future possibilities for interactive music-driven gameplay. Upcoming improvements will focus on refining UV mapping and further reducing load times to ensure terrain generation remains smooth and visually cohesive.