Ben Foord (2033179)
Music Visualizer

Project Abstract
My project explores the potential of real-time music visualization as a tool for enhancing musical understanding, particularly for those without formal musical training. As digital media becomes increasingly interactive and immersive, there is growing interest in how audio-visual systems can make abstract musical concepts more accessible and intuitive. The aim of this research is to develop a system that visualizes key musical features, revealing the underlying structure and expressive content of a piece. This work positions itself as a creative and technical exploration at the intersection of music analysis and computer graphics. The visualizer is built in Python using OpenGL and presents a dynamic particle cloud that responds to a range of extracted audio features: the FFT for frequency content, MFCCs for timbral characteristics, beat detection for rhythmic patterns, and Tonnetz analysis for harmonic relationships. These data streams drive the movement, color, and behavior of the particles, creating a multi-dimensional visual reflection of the music. The resulting system reveals trends in musical texture, rhythm, and harmony through evolving patterns and motion, producing an abstract but information-rich representation of the music that changes in real time with the audio. This work contributes a novel method for representing music visually, demonstrating how computational feature extraction and real-time graphics can combine to make the inner structure of music more perceptible, especially for those who may not engage with traditional notation or theory.
Keywords: Music Visualization, Audio Feature Extraction, Generative Graphics
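The listing below is a minimal sketch of the feature-extraction stage described in the abstract. It assumes the librosa and NumPy libraries, which the abstract does not name, and the function name, hop length, and number of MFCC coefficients are illustrative choices rather than the project's actual parameters.

    # Sketch of the feature-extraction stage, assuming librosa/NumPy are used;
    # the real project may rely on different libraries or settings.
    import librosa
    import numpy as np

    def extract_features(path, hop_length=512):
        """Load an audio file and compute the per-frame features named in the abstract."""
        y, sr = librosa.load(path, sr=None, mono=True)

        # FFT-based magnitude spectrogram: frequency content per frame.
        spectrum = np.abs(librosa.stft(y, hop_length=hop_length))

        # MFCCs: a compact per-frame description of timbre.
        mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=13, hop_length=hop_length)

        # Beat tracking: estimated tempo plus the frames on which beats fall.
        tempo, beat_frames = librosa.beat.beat_track(y=y, sr=sr, hop_length=hop_length)

        # Tonnetz: six-dimensional tonal-centroid features capturing harmonic relationships.
        tonnetz = librosa.feature.tonnetz(y=librosa.effects.harmonic(y), sr=sr)

        return {
            "sr": sr,
            "spectrum": spectrum,
            "mfcc": mfcc,
            "tempo": tempo,
            "beats": beat_frames,
            "tonnetz": tonnetz,
        }

Per-frame slices of these arrays could then be fed to the OpenGL particle system each frame, for example mapping spectral energy to particle motion, MFCC values to color, beat frames to bursts of movement, and Tonnetz coordinates to spatial grouping.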
Conference Details
Session: A
Location: Sir Stanley Clarke Auditorium, 11:00–13:00
Markers: Xianghua Xie, Eike Neumann
Course: BSc Software Engineering 3yr FT
Future Plans: I’m looking for work