Estimation of Harpsichord Inharmonicity and Temperament from Musical Recordings
Abstract: Recent advances in music signal processing, together with increases in desktop computing power, have facilitated the automation of many aspects of the analysis of music recordings. For example, the extraction of semantic metadata such as genre, key, chord and beat has been a major focus of the music informatics research community. Such work has applications in classification (e.g. organisation and navigation of music collections), recommendation (e.g. discovery and marketing of new music) and annotation (e.g. automatic transcription for education, musicological research and music practice), and it complements traditional research methods in musicology by enabling more quantitative and larger-scale analyses. In particular, researchers and practitioners of early Western music debate the virtues of various tuning systems (temperaments) in terms of their theoretical properties, but they have been unable to substantiate (or refute) claims about performance practice with empirical data, having no means of measuring temperaments from musical recordings.
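As background for the inharmonicity referred to in the title (a sketch of the standard physical model, not the paper's estimation method): a stiff string's partials deviate from exact integer multiples of the fundamental, following f_n = n · f0 · sqrt(1 + B·n²), where B is the inharmonicity coefficient. A minimal Python illustration, using an arbitrary illustrative value of B rather than a measured harpsichord figure:

```python
import math

def stiff_string_partials(f0, B, n_partials=8):
    """Partial frequencies of a stiff string with inharmonicity
    coefficient B, using the standard model
        f_n = n * f0 * sqrt(1 + B * n**2).
    For B = 0 the partials are exactly harmonic (integer multiples of f0).
    """
    return [n * f0 * math.sqrt(1.0 + B * n * n) for n in range(1, n_partials + 1)]

# Illustrative example: a string tuned near C3 (~130.81 Hz) with a small,
# hypothetical inharmonicity coefficient. Higher partials are progressively
# sharper than their harmonic positions.
partials = stiff_string_partials(130.81, 2e-4, n_partials=6)
```

Note how the deviation grows with the square of the partial number, which is why inharmonicity complicates frequency-based tuning analysis: the upper partials of a note no longer align with the frequencies a purely harmonic model would predict.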