It’s Just a Phase
Can we do time-stretching…in real-time? What if instead of just stretching time, we could freeze it, scrub through it, and morph between different moments in a sound?
We'll explore these questions, and more, by building out some creative ideas for synthesis, armed with just a few samples and the trusty FFT.
We'll cover the basics of time-stretching audio with the FFT using a phase vocoder, and in doing so build an intuition for how the phases in a Fourier transform affect the sound, with some beautiful interactive plots.
We'll then build a full synth from some of these (usually offline) time-stretching techniques: a waveform player where you can scrub through the sound at any speed, pause the playhead to freeze the sound, then jump around at random. We'll add some pitch shifting, and show that phase vocoding offers a texturally rich, organic alternative to granular synthesis - no grain boundaries or windowing artifacts.
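To make the core idea concrete, here is a minimal, illustrative phase-vocoder time-stretch in Python/NumPy. It is a sketch, not the talk's actual implementation, and all names and parameters here are my own: each output frame interpolates bin magnitudes between neighbouring analysis frames, while the phase advances by each bin's expected per-hop increment plus the measured deviation, keeping the stretched signal phase-coherent.

```python
import numpy as np

def stft(x, win, hop):
    """Hop through x, window each frame, and take the real FFT."""
    n = len(win)
    return np.array([np.fft.rfft(x[i:i + n] * win)
                     for i in range(0, len(x) - n, hop)])

def phase_vocoder_stretch(x, rate, n_fft=1024, hop=256):
    """Time-stretch x by `rate` (>1 = longer) with a basic phase vocoder."""
    win = np.hanning(n_fft)
    S = stft(x, win, hop)                 # shape: (n_frames, n_bins)
    n_frames, n_bins = S.shape
    # Expected phase advance per hop for each bin's centre frequency.
    omega = 2 * np.pi * np.arange(n_bins) * hop / n_fft
    out_frames = int(n_frames * rate)
    read_pos = np.arange(out_frames) / rate   # fractional analysis positions
    phase = np.angle(S[0])
    y = np.zeros(out_frames * hop + n_fft)
    for i, pos in enumerate(read_pos):
        k = min(int(pos), n_frames - 2)
        frac = pos - k
        # Interpolate magnitude between neighbouring analysis frames.
        mag = (1 - frac) * np.abs(S[k]) + frac * np.abs(S[k + 1])
        # Measured phase increment minus expected, wrapped to [-pi, pi].
        dphi = np.angle(S[k + 1]) - np.angle(S[k]) - omega
        dphi -= 2 * np.pi * np.round(dphi / (2 * np.pi))
        # Accumulate phase at the analysis hop rate: coherence is preserved.
        phase = phase + omega + dphi
        # Resynthesise and overlap-add with a matching synthesis window.
        y[i * hop:i * hop + n_fft] += np.fft.irfft(mag * np.exp(1j * phase)) * win
    return y
```

A real-time scrubber is the same loop with `pos` driven by the playhead instead of a fixed rate; holding `pos` still while the phase keeps accumulating is exactly the "freeze" effect.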
Along the way, I'll try to answer questions like:
- What the heck even ARE phases?
- How do we handle phase coherence differently for onsets versus harmonic sounds?
- How do we make processed noise sound natural instead of artificial?
- How can we use these algorithms in real-time synthesis?
Throughout, we'll dig into practical implementation details - you'll come away with some tools for using FFTs in ways you might not have expected, and techniques for making these algorithms work in real-time. I’ll also provide a link to an open-source library that uses these techniques.
Target Audience: Audio developers with basic FFT knowledge interested in creative synthesis applications and practical DSP implementation.

Cameron Thomas
Hi, I'm Cam.
I'm a part-time staff engineer at Vochlea Music and a contract audio plugin developer.
I especially like experimental and ambient music, and love finding interesting ways of processing sound.