DML paper published in Musical Quarterly

A paper describing the relationships between Music Information Retrieval, Big Music Data, and musicology in the analysis of recorded music, with particular reference to the DML project, was recently published in Musical Quarterly.

The paper, entitled “Big Music Data, Musicology, and the Study of Recorded Music: Three Case Studies” and authored by Stephen Cottrell (Professor of Music at City, University of London and Co-Investigator in the DML project), can be viewed by following the link below:
https://academic.oup.com/mq/advance-article/doi/10.1093/musqtl/gdy013/5235404?searchresult=1

DML paper published in ACM JOCCH

A paper describing the infrastructure of the Digital Music Lab framework has been published in the ACM Journal on Computing and Cultural Heritage (JOCCH). The paper is entitled “The Digital Music Lab: A Big Data Infrastructure for Digital Musicology” and can be viewed by following the link below:
http://dl.acm.org/citation.cfm?id=2983918

A postprint version is also available to download at:
https://qmro.qmul.ac.uk/xmlui/handle/123456789/15701

DML mention in International Musicological Society Newsletter

The Digital Music Lab project was mentioned in the 2016 newsletter of the International Musicological Society (IMS), in the “Study Group on Digital Musicology” section (p. 23).

DML Code Repositories

Below we provide a list of source-code repositories where we publish the code underlying the DML system. All code is hosted in Mercurial repositories at code.soundsoftware.ac.uk under the GPLv3 license.

The dml-open-cliopatria repository contains the implementation of the information and results management system and its API. The source code can be obtained with:

hg clone https://code.soundsoftware.ac.uk/hg/dml-open-cliopatria

The source for the DML Vis interface is hosted here:

hg clone https://code.soundsoftware.ac.uk/hg/dml-open-vis

We welcome contributions to the code. If you use the code for a scientific publication, you can cite […]. Tools for working with the Mercurial version control system are available at https://www.mercurial-scm.org/; EasyMercurial provides a GUI.

Explore the DML Vis Interface


The DML Vis interface is now available online. It enables you to explore, analyse, and compare music collections and recordings from three large sources: the British Library’s Sound Archives, CHARM, and I Like Music.


We invite you to explore the interface at http://dml.city.ac.uk/vis/ and to have a look at our introduction.

Furthermore, we provide access to the analyses and features used in the DML interface via our ClioPatria service. There you can browse the triple store by predicates such as bl composer (e.g. for classical music) or subject, which is well suited to ethnographic recordings. We are happy to receive feedback.
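As a starting point for programmatic access, the sketch below queries a ClioPatria SPARQL endpoint using only the Python standard library. Note the assumptions: the endpoint path (/sparql/) follows ClioPatria’s default configuration and may differ in the DML deployment, and the query shown is merely an illustrative way to discover which predicates the store uses.

```python
# Minimal sketch: querying a ClioPatria SPARQL endpoint with the Python
# standard library. The endpoint path ("/sparql/") is an assumption based
# on ClioPatria's defaults; adjust it to match the actual DML service.
import json
import urllib.parse
import urllib.request

ENDPOINT = "http://dml.city.ac.uk/sparql/"  # assumed ClioPatria default path


def build_request(query: str, endpoint: str = ENDPOINT) -> urllib.request.Request:
    """Build a POST request carrying the SPARQL query, asking for JSON results."""
    data = urllib.parse.urlencode({"query": query}).encode()
    return urllib.request.Request(
        endpoint,
        data=data,
        headers={"Accept": "application/sparql-results+json"},
    )


def run_query(query: str, endpoint: str = ENDPOINT) -> dict:
    """Send the query and return the parsed SPARQL JSON results."""
    with urllib.request.urlopen(build_request(query, endpoint)) as resp:
        return json.load(resp)


# Illustrative query: list a handful of distinct predicates in the triple
# store, e.g. to discover composer- or subject-related predicates to browse by.
PREDICATES = "SELECT DISTINCT ?p WHERE { ?s ?p ?o } LIMIT 10"

# Example usage (requires network access to the DML server):
# for row in run_query(PREDICATES)["results"]["bindings"]:
#     print(row["p"]["value"])
```

Any SPARQL client or plain HTTP tool would work equally well; the point is only that the triple store is queryable over a standard protocol rather than solely through the Vis interface.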

DML project at FMA 2015

Work carried out on analysing world and traditional music as part of the DML project will be presented at the 5th International Workshop on Folk Music Analysis (FMA 2015). FMA will take place on 10-12 June in Paris, France. Project-related papers are listed below:

  • S. Abdallah, A. Alencar-Brayner, E. Benetos, S. Cottrell, J. Dykes, N. Gold, A. Kachkaev, M. Mahey, D. Tidhar, A. Tovell, T. Weyde, and D. Wolff, “Automatic transcription and pitch analysis of the British Library World & Traditional Music Collection”
  • A. Leroi, M. Mauch, P. Savage, E. Benetos, J. P. Bello, M. Panteli, J. Six, and T. Weyde, “The deep history of music project”

DML project at “Quantitative Data and Music Research” symposium

The DML project will be presented at the “Numbers, Noises and Notes: Quantitative Data and Music Research” symposium, which takes place on Tuesday 16th June at The Sussex Humanities Lab, University of Sussex.

During the symposium, Dr Tillman Weyde (PI for the DML project) will give a talk on “Analysing Big Music Data: Audio Transcription and Pitch Analysis of World and Traditional Music”.

DML project at THATCamp British Library Labs

thatcamp

DML project members participated in the THATCamp British Library Labs, which took place on 13th February 2015 at the British Library. THATCamp stands for “The Humanities and Technology Camp”: an open, inexpensive meeting where humanists and technologists of all skill levels learn and build together in sessions pitched and voted on at the beginning of the day.

UPDATE: There is a report on the workshop at the BL Digital Scholarship blog.

As part of the workshop, we proposed a session entitled “Big Data for Musicology”. The session was well attended by both technologists and humanists, and led to a useful discussion on user requirements and issues regarding the creation of a system for automatic music analysis. Some of the discussion and feedback is summarised below:

On user requirements from a “Big Data for Music” system:

– Search/organise music collections by spatial information
– Coming up with a “definitive” version of a song that has many cover versions: extrapolating information from various historical performances, deriving a “median” performance, and comparing different performances using mathematical models.
– Audio forensics for performance analysis?
– There may be a role for expert users rather than relying on a large crowd; that community of experts could be targeted. Crowdsourcing could be used to make links between data.

On the chord visualisations demo:

– It is interesting to see that there are groupings of chords in a particular genre.
– Useful for music education, where you can see where your music sits in terms of a specific genre. Also where a piece sits in the music market.
– Could you have a playback feature, where one could play representative tracks with specific chord patterns? Also link with music scores.
– Browse genres/tracks by colour or shape or pattern?

On music-related games with a purpose and the Spot the Odd Song Out game:

– How did you promote?
– Seems difficult trying to compete in the games market – it might be easier to target smaller groups/expert users.
– Having a music-related game can be more difficult than e.g. an image-based one, since it requires at least headphones or speakers.