I’m fascinated by the way sound and images combine, and how we perceive them. I like concerts because the music is accompanied by visual complements, whether it is lighting, dancing, or (increasingly) digital images and videos on the big screens.
In the big leagues, these visualizations are high-quality animations, often specifically designed for the song that is playing. One that always comes to mind is the animation that accompanied Tiesto’s ‘Escape Me’ during his Kaleidoscope world tour. It was many years ago, and this was the best video I could find:
Amazing production – the visuals really complemented the song. But what about music in the mid-leagues or little-leagues? Is it possible to entertain and engage people with more accessible visuals? This is the avenue I would like to explore for this final project.
My goal is to create a dynamic music visualizer – an accessible sketch that detects something in the music data and provides visual feedback in real (or near-real) time.
The ‘something’ that it detects remains to be determined. Some ideas for that ‘something’ include:
- A specific sound or instrument (e.g. horns)
- The intensity/climax/’drop’ of a song (a rough amplitude-based sketch of this idea follows the list)
- Typical sections of a song, such as the verses, chorus, or bridge as described here
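As a first pass at the ‘drop’ idea, the simplest thing I can imagine is tracking the smoothed loudness of the track and flagging moments where it crosses a threshold. A minimal sketch of that (assuming p5.js and p5.sound are loaded, and using a placeholder filename ‘track.mp3’ and an arbitrary, untuned threshold):

```js
// Minimal sketch: p5.js + p5.sound loaded, 'track.mp3' is a placeholder file
let song;
let amp;

function preload() {
  song = loadSound('track.mp3');
}

function setup() {
  createCanvas(400, 400);
  amp = new p5.Amplitude();
  amp.smooth(0.9); // smooth the level so a single kick doesn't trigger the flash
}

function mousePressed() {
  // browsers generally require a user gesture before audio will play
  song.isPlaying() ? song.pause() : song.loop();
}

function draw() {
  let level = amp.getLevel(); // 0.0-1.0, roughly how loud this frame is
  // crude 'drop' heuristic: flash whenever the smoothed level crosses a threshold
  background(level > 0.3 ? color(255, 0, 100) : color(20));
  // basic visual feedback: a circle scaled by the current level
  noStroke();
  fill(255);
  ellipse(width / 2, height / 2, 50 + level * 300);
}
```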
The first hurdle to clear in making this a reality is to get intimate with the p5.sound library; it provides some interesting ways to turn sound into data, and that data into something visual. I expect to use FFT (fast Fourier transform) and its accompanying functions. Hopefully this library can help me detect one or more of the ‘somethings’ listed above.
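On the frequency side, p5.FFT exposes both the full spectrum and the energy in a given band, which seems like a natural way to watch for something like horns in the mid range. A rough sketch of the kind of usage I have in mind – the frequency band and threshold are guesses, and ‘track.mp3’ is again a placeholder:

```js
let song, fft;

function preload() {
  song = loadSound('track.mp3'); // placeholder filename
}

function setup() {
  createCanvas(600, 300);
  fft = new p5.FFT(0.8, 128); // smoothing, number of bins
}

function mousePressed() {
  // start/stop on click, since browsers usually block autoplay
  song.isPlaying() ? song.pause() : song.loop();
}

function draw() {
  background(0);
  let spectrum = fft.analyze(); // array of 128 amplitudes, values 0-255

  // draw the spectrum as simple bars
  noStroke();
  fill(0, 255, 180);
  let w = width / spectrum.length;
  for (let i = 0; i < spectrum.length; i++) {
    let h = map(spectrum[i], 0, 255, 0, height);
    rect(i * w, height - h, w - 1, h);
  }

  // energy in a rough 'horn-ish' band (the Hz values are a guess, not tuned)
  let mids = fft.getEnergy(500, 2000); // 0-255
  if (mids > 200) {
    fill(255, 200, 0);
    ellipse(width / 2, height / 2, 100); // react when that band gets loud
  }
}
```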
The second challenge to overcome is management of the media – including its storage and loading/buffering. Using the p5 online web editor is not an option for several reasons, including the fact that it only accepts media files of a few megabytes or less.
To handle this, I plan to host the sketch on a server using node.js, so that it can be accessed anywhere in the world. I will also have to be careful about making the code ‘efficient’ and preventing lag. This was an issue even for a simple tint effect I wanted to make in a previous sketch.
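As a starting point for the hosting side, a minimal static server would probably be enough. The sketch below assumes Node.js with Express installed and the p5 sketch plus audio sitting in a public/ folder – none of these names are fixed yet:

```js
// server.js - minimal static server for the sketch (assumes `npm install express`)
const express = require('express');
const path = require('path');

const app = express();
const PORT = process.env.PORT || 3000;

// serve index.html, sketch.js and the audio files from ./public
app.use(express.static(path.join(__dirname, 'public')));

app.listen(PORT, () => {
  console.log(`Visualizer running at http://localhost:${PORT}`);
});
```

Serving the audio as plain static files should also let the browser handle the buffering, which may help with the lag concern.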
A final challenge will be deciding how flexible and functional the sketch should become. While I think the visual media of this Butter Churn Visualizer is lacking, I really like the following aspects of it:
- The interface – a simple full page of media with clear, neat overlays at the top and bottom
- The functionality to select your own music – While the first versions of my final project will be based on a single music file (or a small set of them), I hope to add the option for users to upload their own music; a rough sketch of this follows below. Another option would be to allow users to select songs from a service such as Spotify, but based on my colleague Brandon’s work on the topic, that seems like a project in and of itself.
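For the upload idea, p5’s createFileInput could feed a user-selected file into loadSound. This is only a sketch of the pattern – whether loading from the file’s data URL holds up for large tracks is something I still need to test:

```js
let song;

function setup() {
  createCanvas(400, 200);
  // file picker rendered below the canvas; the callback receives a p5.File
  let input = createFileInput(handleFile);
  input.position(10, height + 10);
}

function handleFile(file) {
  if (file.type === 'audio') {
    if (song) song.stop(); // replace whatever was playing
    // file.data is a data URL; loading it this way may be slow for big files
    song = loadSound(file.data, () => song.loop());
  }
}

function draw() {
  background(30);
  text(song && song.isPlaying() ? 'playing...' : 'pick an audio file', 20, 30);
}
```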
For this project I am going to take some good advice and start simple: I want to achieve a ‘something’ with one or two tracks. I will first build it locally, then try taking it online. If (and only if?) this minimum is achieved, we can talk about more features.