Music research, particularly in fields such as systematic musicology, ethnomusicology, and music psychology, has developed as “data-oriented empirical research” that benefits from computing methods. However, this research has so far been restricted to relatively small datasets because of technological and legal constraints. Researchers in Music Information Retrieval (MIR), on the other hand, have begun to explore large datasets, particularly in commercial recommendation and playlisting systems, but there are differences in terminology, methods, and goals between MIR and musicology, as well as technological and legal barriers between the two fields. The proposed Digital Music Lab will support music research by bridging the gap to MIR and providing access to large music collections and powerful analysis and visualisation tools.
The Digital Music Lab project will develop research methods and software infrastructure for exploring and analysing large-scale music collections. A major output of the project will be a service infrastructure with two prototype installations. One installation will enable researchers, musicians and general users to explore, analyse and extract information from music recordings held at the British Library. The other installation will be hosted at Queen Mary University of London and will provide facilities to analyse audio collections such as the I Like Music, CHARM and Isophonics datasets, creating a data collection of significant size (over 1 million pieces). We will provide researchers with tools to analyse music audio, scores and metadata. The combination of state-of-the-art music analysis at the audio and symbolic levels with intelligent collection-level analysis methods will enable exploration and quantitative research on music at a scale that has not previously been possible. The use of the proposed framework will be demonstrated in musicological research on classical music, as well as on folk, world and popular music. The results of these analyses will be presented through highly interactive visual interfaces and released as open data and open-source software.