A paper describing the relationships between Music Information Retrieval, Big Music Data, and musicology in the analysis of recorded music, with particular reference to the DML project, was recently published in Musical Quarterly.
The paper, entitled “Big Music Data, Musicology, and the Study of Recorded Music: Three Case Studies” and authored by Stephen Cottrell (Professor of Music at City, University of London and Co-Investigator in the DML project), can be viewed by following the link below:
A paper describing the infrastructure of the Digital Music Lab framework has been published in the ACM Journal on Computing and Cultural Heritage (JOCCH). The paper is entitled “The Digital Music Lab: A Big Data Infrastructure for Digital Musicology” and can be viewed by following the link below:
A postprint version is also available for download at:
The Digital Music Lab project was mentioned in the 2016 newsletter of the International Musicological Society (IMS), in the “Study Group on Digital Musicology” section (p.23), regarding:
Below we provide a list of the repositories where we publish the code underlying the DML system. All code is hosted in Mercurial repositories at code.soundsoftware.ac.uk under the GPLv3 license.
The cliopatria repository contains the implementation of the information and results management system and API. The source code can be found here:
hg clone https://code.soundsoftware.ac.uk/hg/dml-open-cliopatria
The source for the DML Vis is hosted here:
hg clone https://code.soundsoftware.ac.uk/hg/dml-open-vis
We welcome contributions to the code. If you use the code for a scientific publication, you can cite […]. Tools for working with the Mercurial version control system are available at https://www.mercurial-scm.org/, with a GUI available via EasyMercurial.
The DML Vis interface is now available online. It enables you to explore, analyse, and compare music collections and recordings from three large libraries: the British Library’s Sound Archives, CHARM, and I Like Music.
We invite you to play with the interface: http://dml.city.ac.uk/vis/ and have a look at our introduction.
Furthermore, we provide access to the analyses and features used in the DML interface via our ClioPatria service. There you may browse the triple store by predicates such as bl composer (e.g. for classical music) or subject, which is well suited to ethnographic recordings. We are happy to receive feedback.
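As a rough sketch of what programmatic access to such a service might look like: ClioPatria installations typically expose a standard SPARQL endpoint, so a query for recordings linked by a given predicate can be built as a simple GET URL. Note that the endpoint path and the predicate URI below are illustrative assumptions, not the actual DML vocabulary.

```python
from urllib.parse import urlencode

# Assumed endpoint path: ClioPatria serves a SPARQL endpoint under /sparql/
# by default; the exact path on the DML host may differ.
DML_SPARQL = "http://dml.city.ac.uk/cliopatria/sparql/"

def build_query_url(predicate_uri, limit=10):
    """Build a GET URL asking the triple store for subject/value pairs
    linked by the given predicate (e.g. a composer or subject property)."""
    query = (
        "SELECT ?recording ?value WHERE {{ "
        "?recording <{p}> ?value . }} LIMIT {n}"
    ).format(p=predicate_uri, n=limit)
    return DML_SPARQL + "?" + urlencode({"query": query, "format": "json"})

# The predicate URI here is purely illustrative, standing in for whatever
# property (composer, subject, ...) the store actually uses.
url = build_query_url("http://example.org/bl/composer")
```

Fetching the resulting URL (e.g. with urllib.request) would return the bindings as JSON, which can then be filtered or joined against other metadata.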
The DML project will be presented at the “Numbers, Noises and Notes: Quantitative Data and Music Research” symposium, which takes place on Tuesday 16th June at The Sussex Humanities Lab, University of Sussex.
During the symposium, Dr Tillman Weyde (PI for the DML project) will give a talk on “Analysing Big Music Data: Audio Transcription and Pitch Analysis of World and Traditional Music”.
The British Library Digital Scholarship team wrote a piece on the DML Final Workshop that took place on 13th March 2015. You can read the story here.
The main webpage for the DML Final Workshop on Analysing Big Music Data has been updated with slides from the various presentations – more to come!
DML project members participated in the THATCamp British Library Labs, which took place on 13th February 2015 at the British Library. THATCamp stands for “The Humanities and Technology Camp”: an open, inexpensive meeting where humanists and technologists of all skill levels learn and build together in sessions pitched and voted on at the beginning of the day.
UPDATE: There is a report on the workshop at the BL Digital Scholarship blog.
As part of the workshop, we proposed a session entitled “Big Data for Musicology”. The session was well attended by both technologists and humanists, and led to a useful discussion on user requirements and issues regarding the creation of a system for automatic music analysis. Some of the discussion/feedback is summarised below:
On user requirements from a “Big Data for Music” system:
– Search/organise music collections by spatial information
– Coming up with a “definitive” version of a song that has many cover versions; extrapolating information from various historical performances to produce a “median” performance; comparing different performances using mathematical models.
– Audio forensics for performance analysis?
– There may be a role for expert users rather than relying on a large crowd: target that community of experts. Crowdsourcing could be used to make links between data.
On the chord visualisations demo:
– It is interesting to see that there are groupings of chords in a particular genre.
– Useful for music education, where you can see where your music sits within a specific genre, and also where a piece sits in the music market.
– Could you have a playback feature, where one could play representative tracks with specific chord patterns? Also link with music scores.
– Browse genres/tracks by colour or shape or pattern?
On music-related games with a purpose and the Spot the Odd Song Out game:
– How did you promote?
– Seems difficult trying to compete in the games market – it might be easier to target smaller groups/expert users.
– Having a music-related game can be more difficult than e.g. an image-based one, since it at least requires headphones/speakers.