The first UCL Digifest was held last month, from 10th–14th November. Digifest is “a celebration of all things digital at UCL”, which for the UCL Music Systems Engineering Research Team (a.k.a. MUSERT: Nicolas Gold, Samer Abdallah and Christodoulos Aspromalis) meant a chance to show off some of our recent activities, as well as to call together the first meeting and performance of the UCL Laptop Orchestra, or UCLOrk.
The UCLOrk meeting (on Wednesday 12th) consisted of a tutorial session on computer music and building digital instruments using PureData, followed by a performance. The Computer Science technical support team worked wonders by building (in a very short space of time) 12 hemispherical speakers, so that each member of the orchestra could have their own speaker next to them. Hemispherical speakers are often used for laptop orchestras as they diffuse the sound better than ordinary speakers and mimic the effect of having many instruments distributed physically around the performance space. At the end of the session, orchestra members took their cues from an animated visual score created by Christodoulos; you can find a video here (complete with the occasional Mac volume-changing sound…).
At the showcase session on Friday 14th, we demonstrated various music-related applications, including Christodoulos’s affective generative music system for computer games; the MiCLUES app (MiCLUES is a Share Academy/Arts Council England-funded project, in collaboration with the Royal College of Music (RCM), to guide visitors to the Museum of Instruments at the RCM); a device to help keep to the recommended 4-minute shower time limit; and a prototype of the DML information management system. The DML prototype allowed users to browse an RDF graph containing information about a local collection of audio files and symbolic scores, and then use this as a jumping-off point for going out into the Semantic Web to pull in more information (via Linked Open Data and SPARQL endpoints), for example from MusicBrainz (and LinkedBrainz) or DBpedia. Items in the symbolic music library could be accessed as machine-readable scores (in several formats, such as MIDI, Humdrum/Kern, MusicXML and Lilypond), traditionally engraved scores (using Lilypond), or audio rendered from MIDI using Fluidsynth. The prototype also showed how a computational engine (in this case Matlab) can be integrated into the system, so that large-scale musicological research can be conducted and the results managed.
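To give a flavour of the “jumping-off point” idea, here is a minimal sketch of how an application might follow an entity found in a local RDF graph out to a public SPARQL endpoint such as DBpedia’s. This is illustrative only: the endpoint URL and query shape are standard DBpedia conventions, not the DML prototype’s actual queries, and the composer URI is just an example.

```python
# Sketch: build a SPARQL query for a composer URI and prepare an HTTP
# request for DBpedia's public endpoint. The request is constructed but
# deliberately not sent, so the example stays self-contained.
import urllib.parse
import urllib.request

DBPEDIA_ENDPOINT = "https://dbpedia.org/sparql"  # public endpoint

def build_query(composer_uri: str) -> str:
    """SPARQL query asking for a composer's birth date and English abstract."""
    return f"""
    PREFIX dbo: <http://dbpedia.org/ontology/>
    SELECT ?birthDate ?abstract WHERE {{
      <{composer_uri}> dbo:birthDate ?birthDate ;
                       dbo:abstract ?abstract .
      FILTER (lang(?abstract) = "en")
    }}
    """

def sparql_request(endpoint: str, query: str) -> urllib.request.Request:
    """Prepare (but do not send) a GET request asking for JSON results."""
    params = urllib.parse.urlencode({
        "query": query,
        "format": "application/sparql-results+json",
    })
    return urllib.request.Request(
        endpoint + "?" + params,
        headers={"Accept": "application/sparql-results+json"},
    )

req = sparql_request(DBPEDIA_ENDPOINT,
                     build_query("http://dbpedia.org/resource/Henry_Purcell"))
```

Passing `req` to `urllib.request.urlopen` would return a JSON result set that could then be merged back into the local graph, which is essentially what the Linked Open Data pattern amounts to in practice.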
Finally, running throughout the week was an application to collect visitor feedback, in the form of descriptive or mood-related words, and generate a music playlist reflecting that feedback. The playlist was compiled by using The Echo Nest web service to search for songs matching a subset of the descriptive words, which were then added (after filtering out anything likely to be too offensive!) to a Spotify playlist using the Spotify API. If you have a Spotify account, you can see the last state of the playlist under user name ‘ucldigifest’, playlist name ‘digifest’. It may change without warning at any time!
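The words-to-playlist pipeline above can be sketched as follows. This is a hedged outline, not the application’s actual code: the search step is stubbed out (the real system called The Echo Nest and Spotify web APIs), and the blocklist is a placeholder for whatever filter was actually used.

```python
# Sketch: visitor words in, candidate tracks out. Offensive words are
# filtered before searching; results are de-duplicated in order.
from typing import Callable, Iterable

BLOCKLIST = {"offensiveword"}  # placeholder; the real filter list is not public

def acceptable(word: str) -> bool:
    """Reject words on the blocklist before they reach the search service."""
    return word.lower() not in BLOCKLIST

def build_playlist(words: Iterable[str],
                   search: Callable[[str], list[str]],
                   limit: int = 20) -> list[str]:
    """Search for tracks matching each acceptable word, de-duplicating
    while preserving order, up to `limit` tracks."""
    seen, playlist = set(), []
    for word in words:
        if not acceptable(word):
            continue
        for track in search(word):
            if track not in seen:
                seen.add(track)
                playlist.append(track)
            if len(playlist) >= limit:
                return playlist
    return playlist

# Stub standing in for a song-search web service:
fake_search = lambda w: [f"{w}-song-1", f"{w}-song-2"]
playlist = build_playlist(["happy", "offensiveword", "calm"], fake_search)
# playlist: ['happy-song-1', 'happy-song-2', 'calm-song-1', 'calm-song-2']
```

In the real application the final step would push each surviving track to the ‘digifest’ playlist via the Spotify API rather than returning a list.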