You can view the press release for the ASyMMuS project by clicking the link below:
Transforming Musicology blog post on music similarity
Dr Alan Marsden (Co-I for the ASyMMuS project) wrote a post on the Transforming Musicology blog entitled “Similarity: haven’t we heard this before somewhere?”. The post mentions the ASyMMuS project and its connections with other AHRC-funded projects:
http://transforming-musicology.org/blog/2015-04-09_similarity-havent-we-heard-this-before-somewhere/
DML Final Workshop (report from BL Digital Scholarship Blog)
The British Library Digital Scholarship team wrote a piece on the DML Final Workshop that took place on 13th March 2015. You can read the story here.
DML Final Workshop: Slides
The main webpage for the DML Final Workshop on Analysing Big Music Data has been updated with slides from the various presentations – more to come!
DML project at THATCamp British Library Labs
DML project members participated in the THATCamp British Library Labs, which took place on 13th February 2015 at the British Library. THATCamp stands for “The Humanities and Technology Camp”: an open, inexpensive meeting where humanists and technologists of all skill levels learn and build together, in sessions pitched and voted on at the beginning of the day.
UPDATE: There is a report on the workshop at the BL Digital Scholarship blog.
As part of the workshop, we proposed a session entitled “Big Data for Musicology”. The session was well attended by both technologists and humanists, and led to a useful discussion on user requirements and issues regarding the creation of a system for automatic music analysis. Some of the discussion/feedback is summarised below:
On user requirements from a “Big Data for Music” system:
– Search/organise music collections by spatial information
– Coming up with a “definitive” version of a song that has many cover versions: extrapolating information from various historical performances, deriving a “median” performance, and comparing different performances using mathematical models.
– Audio forensics for performance analysis?
– There may be a role for expert users rather than relying on a large crowd: targeting that community of experts. Crowdsourcing could be used to make links between data.
On the chord visualisations demo:
– It is interesting to see that there are groupings of chords in a particular genre.
– Useful for music education, where you can see where your music sits in terms of a specific genre. Also where a piece sits in the music market.
– Could you have a playback feature, where one could play representative tracks with specific chord patterns? Also link with music scores.
– Browse genres/tracks by colour or shape or pattern?
On music-related games with a purpose and the Spot the Odd Song Out game:
– How did you promote?
– Seems difficult trying to compete in the games market – it might be easier to target smaller groups/expert users.
– Having a music-related game can be more difficult than e.g. an image-based one, since it requires at least headphones/speakers.
Digital Music Lab Final Workshop on Analysing Big Music Data
13 March 2015, 10:00 – 16:30
Foyle Suite, Centre for Conservation
British Library
The final workshop of the DML project will take place at the British Library on 13 March 2015. Following short presentations and demos of project outputs and tools, the workshop will be dedicated to a hands-on guided session, in which the project’s analysis and visualisation tools will be applied to relevant large-scale music collections (including the British Library’s Sound Archive). Musicological insights obtained by the big data approach to these collections will be shared and discussed. The workshop will start at 10am (9am for registration and coffee) and finish at 4.30pm. Lunch and refreshments will be provided.
For more information on the workshop, including programme, registration, and venue information, please visit the workshop webpage.
ASyMMuS at Lorentz Center Leiden Workshop on Music Similarity
Several researchers from the ASyMMuS and DML projects made prominent contributions to the high-profile international workshop “Music Similarity: Concepts, Cognition and Computation”.
The workshop gathered experts on music similarity from Computer Science, Musicology, Music Psychology and related scientific fields. In a highly motivated series of working groups and talks, our researchers collaborated with other experts in the field on theoretical concepts and computer models of music similarity.
Main areas addressed:
* Relationship of similarity and categorisation
* Embedding similarity in context
* Perception and cognition of similarity
* Similarity modelling
* Similarity in music content – music analysis
* Similarity in music expression
Results include a roadmap for interdisciplinary music similarity research as well as future collaborations across scientific fields.
DML and ASyMMuS projects at DMRN+9 workshop
Current progress on the DML and ASyMMuS projects will be presented at the Digital Music Research Network Workshop 2014 (DMRN+9), taking place on Tuesday 16th December at Queen Mary University of London. The list of project-related presentations is as follows:
- “The ASyMMuS project: An integrated audio-symbolic model of music similarity”, Emmanouil Benetos, Daniel Wolff, Tillman Weyde (City University London), Nicolas Gold, Samer Abdallah (University College London) and Alan Marsden (Lancaster University)
- “Towards analysing big music data – Progress on the DML research project”, Tillman Weyde, Stephen Cottrell, Jason Dykes, Emmanouil Benetos, Daniel Wolff, Dan Tidhar, Alexander Kachkaev (City University London), Mark Plumbley, Simon Dixon, Mathieu Barthet, Steven Hargreaves (Queen Mary University of London), Nicolas Gold, Samer Abdallah (University College London), Aquiles Alencar-Brayner, Mahendra Mahey and Adam Tovell (The British Library)
DML at UCL Digifest
The first UCL Digifest was held last month, from 10th–14th November. Digifest is “a celebration of all things digital at UCL”, which for the UCL Music Systems Engineering Research Team (a.k.a. MUSERT, i.e. Nicolas Gold, Samer Abdallah and Christodoulos Aspromalis) meant a chance to show off some of our recent activities, as well as to call together the first meeting and performance of the UCL Laptop Orchestra, or UCLOrk.
The UCLOrk meeting (on Wednesday 12th) consisted of a tutorial session on computer music and building digital instruments using PureData, followed by a performance. The Computer Science technical support team worked wonders by building (in a very short space of time) 12 hemispherical speakers, so that each member of the orchestra could have their own speaker next to them. Hemispherical speakers are often used for laptop orchestras as they diffuse the sound better than ordinary speakers and mimic the effect of having many instruments distributed physically around the performance space. At the end of the session, orchestra members took their cues from an animated visual score created by Christodoulos; you can find a video here (complete with the occasional Mac volume-changing sound…).
At the showcase session on Friday 14th, we demonstrated various music-related applications, including Christodoulos’s affective generative music system for computer games; the MiCLUES app, which guides museum visitors around the Museum of Instruments at the Royal College of Music (MiCLUES is a Share Academy/Arts Council England-funded project in collaboration with the RCM); a device to help keep to the recommended 4-minute shower time limit; and a prototype of the DML information management system. The DML prototype allowed users to browse an RDF graph containing information about a local collection of audio files and symbolic scores, and then use this as a jumping-off point for going out into the Semantic Web to pull in more information (via Linked Open Data and SPARQL endpoints), for example from MusicBrainz (and LinkedBrainz) or DBpedia. Items in the symbolic music library could be accessed as machine-readable scores (in several formats, such as MIDI, Humdrum/Kern, MusicXML and Lilypond), traditionally engraved scores (using Lilypond), or audio rendered from MIDI using FluidSynth. The prototype also showed how a computational engine (in this case Matlab) can be integrated into the system, so that large-scale musicological research can be conducted and the results managed.
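The Linked Open Data lookup described above follows a common pattern: send a SPARQL query to a public endpoint and read back JSON results. The following is a minimal sketch of that pattern, assuming Python and the public DBpedia endpoint; the query and function names are illustrative examples, not the DML prototype’s actual code.

```python
# Minimal sketch of a Linked Open Data lookup, using only the standard
# library against the public DBpedia SPARQL endpoint. The query shape and
# function names are illustrative, not the DML prototype's actual code.
import json
import urllib.parse
import urllib.request


def build_works_query(composer_uri: str) -> str:
    """Build a SPARQL query asking DBpedia for English-labelled works
    credited to the given composer URI."""
    return (
        "SELECT ?work ?label WHERE { "
        f"?work dbo:composer <{composer_uri}> ; rdfs:label ?label . "
        'FILTER (lang(?label) = "en") } LIMIT 10'
    )


def fetch_works(composer_uri: str) -> list[str]:
    """Run the query against DBpedia and return the matching work labels."""
    params = urllib.parse.urlencode({
        "query": build_works_query(composer_uri),
        "format": "application/sparql-results+json",
    })
    with urllib.request.urlopen("https://dbpedia.org/sparql?" + params) as resp:
        results = json.load(resp)
    return [b["label"]["value"] for b in results["results"]["bindings"]]
```

In the prototype as described, results like these would be merged back into the local RDF graph, so that a score in the local collection becomes a starting point for exploring MusicBrainz or DBpedia.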
Finally, running throughout the week was an application to collect visitor feedback, in the form of descriptive or mood-related words, and generate a music playlist reflecting that feedback. The playlist was compiled by using the Echo Nest web service to search for songs matching a subset of the descriptive words, which were then added (after filtering out anything likely to be too offensive!) to a Spotify playlist using the Spotify API. If you have a Spotify account, you can see the last state of the playlist under user name ‘ucldigifest’, playlist name ‘digifest’. It may change without warning at any time!
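The filtering step of that pipeline can be sketched as a small pure function. This is a minimal illustration in Python; the blocklist here is a hypothetical stand-in for whatever word filtering the application actually applied.

```python
# Sketch of the feedback-filtering step described above. The blocklist is a
# hypothetical placeholder; the real application's filtering criteria and the
# Echo Nest / Spotify API calls downstream of it are not reproduced here.
BLOCKLIST = {"offensive", "rude"}  # illustrative; a real list would be larger


def filter_feedback(words: list[str]) -> list[str]:
    """Normalise visitor feedback words, dropping blocklisted entries and
    duplicates while preserving first-seen order."""
    seen: set[str] = set()
    kept: list[str] = []
    for word in words:
        w = word.strip().lower()
        if w and w not in BLOCKLIST and w not in seen:
            seen.add(w)
            kept.append(w)
    return kept
```

The surviving words would then be sent to the search service, and the matching tracks added to the playlist via the Spotify API.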
DML project at CIM 2014 conference
Current progress on the DML project will be presented at the 9th Conference on Interdisciplinary Musicology (CIM 2014). CIM will take place on 4-6 December in Berlin, Germany. Project-related papers are listed below (click titles to download abstracts):
- Mathieu Barthet, Mark Plumbley, Alexander Kachkaev, Jason Dykes, Daniel Wolff and Tillman Weyde, Big Chord Data Extraction and Mining
- Alexander Kachkaev, Daniel Wolff, Mathieu Barthet, Mark Plumbley, Jason Dykes and Tillman Weyde, Visualising Chord Progressions in Music Collections: A Big Data Approach