ASyMMuS at MTG Seminar, Universitat Pompeu Fabra, Barcelona

The ASyMMuS project and its integration into the DML web interface were presented by Daniel Wolff during his departmental talk on music similarity.

From the abstract:
The concept of similarity can be applied to music in a multitude of ways. Applications include systems which provide similarity estimates depending on the specific user and context, as well as analysis tools that show similarity of music with regard to specified compositional, physical or contextual features. The ASyMMuS project allows musicologists to apply similarity analysis to musical corpora on a big-data infrastructure – allowing for a comparison of e.g. the works of a certain composer.

Read here for more information and the full abstract.

ASyMMuS Workshop on Audio-Symbolic Music Similarity Modelling

8 July 2015, 10:00 – 15:30
Foyle Suite, Centre for Conservation
British Library

The AHRC-funded project An Integrated Audio-Symbolic Model of Music Similarity (ASyMMuS) aims to integrate aspects of audio and symbolic representations, such as scores or MIDI data, in a joint model. Building on the Digital Music Lab infrastructure, the project aims to promote a data-driven approach to music similarity. This workshop will bring together researchers with different approaches to promote discussion of what constitutes, and what contributes to, music similarity.
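To make the audio-symbolic idea concrete, here is a minimal sketch (not the project's actual model, which is not described here) of one simple way to compare a recording with a score: both are reduced to a 12-dimensional pitch-class profile and compared with cosine similarity. It assumes the librosa and music21 packages, and the file paths are hypothetical.

```python
import numpy as np
import librosa
from music21 import converter

def audio_pitch_class_profile(path):
    """Average chroma vector of an audio recording (12 pitch classes)."""
    y, sr = librosa.load(path)
    chroma = librosa.feature.chroma_stft(y=y, sr=sr)
    return chroma.mean(axis=1)

def symbolic_pitch_class_profile(path):
    """Duration-weighted pitch-class histogram of a score (MIDI, Kern, MusicXML...)."""
    score = converter.parse(path)
    profile = np.zeros(12)
    for note in score.flatten().notes:  # notes and chords, in score order
        for pitch in note.pitches:
            profile[pitch.pitchClass] += float(note.quarterLength)
    return profile

def cosine_similarity(a, b):
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Hypothetical usage:
# sim = cosine_similarity(audio_pitch_class_profile("performance.wav"),
#                         symbolic_pitch_class_profile("score.mid"))
```

A joint model would of course go far beyond pitch-class statistics, but the sketch shows the basic move of mapping both representations into a shared feature space.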

For more information on the workshop, including programme, registration, and venue information, please visit the workshop webpage.

DML project at THATCamp British Library Labs

DML project members participated in THATCamp British Library Labs, which took place on 13th February 2015 at the British Library. THATCamp stands for “The Humanities and Technology Camp”: an open, inexpensive meeting where humanists and technologists of all skill levels learn and build together in sessions pitched and voted on at the beginning of the day.

UPDATE: There is a report on the workshop at the BL Digital Scholarship blog.

As part of the workshop, we proposed a session entitled “Big Data for Musicology”. The session was well attended by both technologists and humanists, and led to a useful discussion on user requirements and issues regarding the creation of a system for automatic music analysis. Some of the discussion/feedback is summarised below:

On user requirements from a “Big Data for Music” system:

– Search/organise music collections by spatial information
– Coming up with a “definitive” version of a song that has many cover versions: extrapolating information from various historical performances to produce a “median” performance, and comparing different performances using mathematical models.
– Audio forensics for performance analysis?
– There may be a role for expert users rather than relying on a large crowd; that community of experts could be targeted. Crowdsourcing could be used to make links between data.

On the chord visualisations demo:

– It is interesting to see that there are groupings of chords in a particular genre.
– Useful for music education, where you can see where your music sits in terms of a specific genre, and where a piece sits in the music market.
– Could you have a playback feature, where one could play representative tracks with specific chord patterns? This could also link with music scores.
– Browse genres/tracks by colour or shape or pattern?

On music-related games with a purpose and the Spot the Odd Song Out game:

– How did you promote the game?
– Seems difficult trying to compete in the games market – it might be easier to target smaller groups/expert users.
– Having a music-related game can be more difficult than e.g. an image-based one, since it at least requires headphones/speakers.

Digital Music Lab Final Workshop on Analysing Big Music Data

13 March 2015, 10:00 – 16:30
Foyle Suite, Centre for Conservation
British Library

The final workshop of the DML project will take place at the British Library on 13 March 2015. Following short presentations and demos of project outputs and tools, the workshop will be dedicated to a hands-on guided session, in which the project’s analysis and visualisation tools will be applied to relevant large-scale music collections (including the British Library’s Sound Archive). Musicological insights obtained by the big data approach to these collections will be shared and discussed. The workshop will start at 10am (9am for registration and coffee) and finish at 4.30pm. Lunch and refreshments will be provided.

For more information on the workshop, including programme, registration, and venue information, please visit the workshop webpage.

DML at UCL Digifest

The first UCL Digifest was held last month, from 10th–14th November. Digifest is “a celebration of all things digital at UCL”, which for the UCL Music Systems Engineering Research Team (a.k.a. MUSERT, i.e. Nicolas Gold, Samer Abdallah and Christodoulos Aspromalis) meant a chance to show off some of our recent activities as well as to call together the first meeting and performance of the UCL Laptop Orchestra, or UCLOrk.

The UCLOrk meeting (on Wednesday 12th) consisted of a tutorial session on computer music and building digital instruments using Pure Data, followed by a performance. The Computer Science technical support team worked wonders by building (in a very short space of time) 12 hemispherical speakers, so that each member of the orchestra could have their own speaker next to them. Hemispherical speakers are often used for laptop orchestras, as they diffuse the sound better than ordinary speakers and mimic the effect of having many instruments distributed physically around the performance space. At the end of the session, orchestra members took their cues from an animated visual score created by Christodoulos; you can find a video here (complete with the occasional Mac volume-changing sound…).

At the showcase session on Friday 14th, we demonstrated various music-related applications, including Christodoulos’s affective generative music system for computer games; the MiCLUES app, a Share Academy/Arts Council England-funded project in collaboration with the Royal College of Music (RCM) that guides visitors around the RCM’s Museum of Instruments; a device to help keep to the recommended 4-minute shower time limit; and a prototype of the DML information management system. The DML prototype allowed users to browse an RDF graph containing information about a local collection of audio files and symbolic scores, and then use this as a jumping-off point for going out into the Semantic Web to pull in more information (via Linked Open Data and SPARQL endpoints), for example from MusicBrainz (and LinkedBrainz) or DBpedia. Items in the symbolic music library could be accessed as machine-readable scores (in several formats, such as MIDI, Humdrum/Kern, MusicXML and Lilypond), traditionally engraved scores (using Lilypond), or audio rendered from MIDI using Fluidsynth. The prototype also showed how a computational engine (in this case Matlab) can be integrated into the system, so that large-scale musicological research can be conducted and the results managed.
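To give a flavour of the Linked Open Data step, here is a minimal sketch of a DBpedia lookup, assuming the Python SPARQLWrapper package; the composer name and the properties queried are illustrative, not the prototype’s actual code.

```python
from SPARQLWrapper import SPARQLWrapper, JSON

# Query the public DBpedia endpoint for the English abstract of a composer
# found in the local RDF graph ("Henry Purcell" is just an example).
sparql = SPARQLWrapper("https://dbpedia.org/sparql")
sparql.setQuery("""
    SELECT ?abstract WHERE {
      ?composer rdfs:label "Henry Purcell"@en ;
                dbo:abstract ?abstract .
      FILTER (lang(?abstract) = "en")
    }
""")
sparql.setReturnFormat(JSON)

results = sparql.query().convert()
for binding in results["results"]["bindings"]:
    print(binding["abstract"]["value"])
```

Lookups of this kind are how items in the local graph can be enriched with facts drawn from several public endpoints.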

Finally, running throughout the week was an application to collect visitor feedback, in the form of descriptive or mood-related words, and generate a music playlist reflecting the feedback. The playlist was compiled by using The Echo Nest web service to search for songs matching a subset of the descriptive words, which were then added (after filtering out anything likely to be too offensive!) to a Spotify playlist using the Spotify API. If you have a Spotify account, you can see the last state of the playlist under user name ‘ucldigifest’, playlist name ‘digifest’. It may change without warning at any time!
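As a rough sketch of that word-to-playlist flow: The Echo Nest API has since been retired, so the version below substitutes a plain Spotify track search for the mood-word lookup, using the spotipy client. The credentials, playlist id, and word filter are placeholders, and this is not the app’s original code.

```python
import spotipy
from spotipy.oauth2 import SpotifyOAuth

# Assumes SPOTIPY_CLIENT_ID / SPOTIPY_CLIENT_SECRET / SPOTIPY_REDIRECT_URI
# are set in the environment.
sp = spotipy.Spotify(auth_manager=SpotifyOAuth(scope="playlist-modify-public"))

def add_songs_for_words(words, playlist_id, blocklist=frozenset()):
    """Search for one track per feedback word and append them to a playlist."""
    track_ids = []
    for word in words:
        if word.lower() in blocklist:  # crude stand-in for the offensiveness filter
            continue
        hits = sp.search(q=word, type="track", limit=1)["tracks"]["items"]
        if hits:
            track_ids.append(hits[0]["id"])
    if track_ids:
        sp.playlist_add_items(playlist_id, track_ids)

# Hypothetical usage:
# add_songs_for_words(["melancholy", "sunny"], playlist_id="<playlist id>")
```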