Centre musicologists and researchers in FAST show and tell session

Dr Kevin Page, John Pybus and Mat Wilcoxson took part in the FAST project Show and Tell on Thursday 26 April, part of the Ideas Unwrapped event at Queen Mary University. The Show and Tell consisted of live demonstrations showcasing some of the project’s most exciting research related to semantic web and audio technologies.

Humanities Lead John Pybus presented the Numbers-into-Notes Semantic Remixer. Numbers into Notes is a software-based interpretation of Ada Lovelace's theorising on the properties of Charles Babbage's Analytical Engine. It has been used in a range of applications, including to create the soundtrack for the film 'Into the Looking Glass – how selfie culture is preparing us to meet our future selves' created by Dr Alan Chamberlain (University of Nottingham), which was nominated for a British Universities Film & Video Council (BUFVC) Learning on Screen Award.

The Prism audience perception app, which gathers audience feedback during live performance, was presented by Software Engineer Mat Wilcoxson. The app, developed in collaboration with colleagues in Oxford and the Royal Northern College of Music, has been used in live public experiments in Manchester and Oxford to investigate human perception of musical features. As well as supporting research, these have served as public engagement events, connecting audiences with music, maths, and particular composers.

Senior Researcher Dr Kevin Page (pictured right) introduced The Climb! performance and score archive (MELD and Muzicodes). "Climb!" is a non-linear musical work for Disklavier and electronics in which the pianist's progression through the piece is not predetermined, but dynamically chosen according to scored challenges and choices. The challenges are implemented using two FAST technologies, Muzicodes and MELD (Music Encoding and Linked Data), which are also used to create an interactive archive through which recorded performances of Climb! can be explored.

Professor David De Roure is Co-Investigator on the Fusing Audio and Semantic Technologies for Intelligent Music Production and Consumption (FAST) project, and Centre researchers have been involved with it in varying ways. Research Associate Dr David Weigl says of the project: "The cross-disciplinary, multi-institutional nature of the project – bringing together collaborative research into digital signal processing, data modelling, incorporation of semantic context, and ethnographic study across all stages of the music lifecycle, from production to consumption, combined with a long-term outlook, seeking to predict and inform the shape of the music industry over the coming decades – makes FAST a particularly exciting project to be involved in."

More information about the live demonstrations is available at http://www.semanticaudio.ac.uk/news/fast-show-tell-session-ideas-unwrapped/