Automatically extracting harmony from recorded music – and what to do with it
Next Wednesday in Utrecht I'll give a short talk on my harmony work of the past few years. The talk is part of the Symposium on Harmony and Variation in Music Information Retrieval, organised by Bas de Haas as a satellite event to his PhD viva. Quite excitingly, some big names of MIR will be there.
Thanks to my collaborations I have a lot of juicy stuff to talk about:
- the SongPrompter application I developed at AIST with Hiromasa Fujihara and Masataka Goto,
- the amazing Songle.jp web service, a big project at AIST to which I contributed chord, key and beat/bar estimates,
- and Last.fm's Driver's Seat, a collaboration with Mark Levy and Sven Over.
These already span quite a wide range of MIR applications, and I will also show a bit of Yanno, Dan Stowell's education project based on YouTube and my Chordino Vamp plugin. Two things link these applications:
- they very closely involve human interaction, some for music making, some for song browsing, some for music discovery in collections
- they integrate many MIR techniques into one multi-faceted application
And, of course, they all involve harmony. SongPrompter combines lyrics and chord alignment based on MFCC and chroma features, and chords are a central part of Yanno and Songle, too. In Last.fm's Driver's Seat the connection is not as obvious, but some of its most enjoyable (to me) features are the search for harmonic creativity (based on my Structural Change feature), for gear shifts, or simply harmonic smoothness.
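For readers unfamiliar with chroma features: the idea is to fold spectral energy down to the 12 pitch classes, which is what makes them useful for chord and harmony estimation. Here is a toy sketch of that folding (my own illustration; the function name and details are assumptions, not the actual extractors used in SongPrompter or Chordino):

```python
import numpy as np

def chroma_from_frame(frame, sr):
    """Fold FFT magnitude bins into 12 pitch classes (toy chroma feature)."""
    spectrum = np.abs(np.fft.rfft(frame))
    freqs = np.fft.rfftfreq(len(frame), d=1.0 / sr)
    chroma = np.zeros(12)
    for f, mag in zip(freqs[1:], spectrum[1:]):  # skip the DC bin
        # MIDI pitch of this bin's centre frequency; 69 = A4 = 440 Hz
        midi = 69 + 12 * np.log2(f / 440.0)
        chroma[int(round(midi)) % 12] += mag
    return chroma / (chroma.sum() + 1e-12)

# a pure 440 Hz sine should concentrate its energy in pitch class 9 (A)
sr = 22050
t = np.arange(2048) / sr
frame = np.sin(2 * np.pi * 440.0 * t)
print(int(np.argmax(chroma_from_frame(frame, sr))))  # → 9
```

Real systems add tuning estimation, beat-synchronous averaging and smarter bin weighting, but the pitch-class folding above is the core idea.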
So I'm looking forward to it a lot – check back for my final slides. [Edit: the slides, including references, are now here.]