Music's changing fast: FAST is changing music
Showcasing the culmination of five years of digital music research, the FAST IMPACt project (Fusing Audio and Semantic Technologies for Intelligent Music Production and Consumption), led by Queen Mary University of London, hosted an invite-only industry day at Abbey Road Studios on Thursday 25 October, 2–8pm.
The FAST project brings together labs from three of the UK's top universities: Queen Mary's Centre for Digital Music, the University of Nottingham's Mixed Reality Lab and the University of Oxford's e-Research Centre.
Presented by Professor Mark Sandler, Director of the Centre for Digital Music at Queen Mary, the event showcased to artists, journalists and industry professionals the next generation technologies that will shape the music industry – from production to consumption. The FAST Industry Day was opened by Lord Tim Clement-Jones (Chair of Council, Queen Mary University of London) and was compered by Professor Mark d’Inverno (Professor of Computing at Goldsmiths College, London).
FAST is looking at how new technologies can positively disrupt the recorded music industry. Research from across the project was presented to the audience, with work from partners at the University of Nottingham and the University of Oxford's e-Research Centre shown alongside that from Queen Mary. The aim was that, by the end of the FAST Industry Day, attendees would come away with an idea of how AI and the Semantic Web can combine with signal processing to overturn conventional ways of producing and consuming music. Along the way, industry attendees were able to preview some of the new ideas, apps and technologies that the FAST team showcased.
Talks, demos, performance, and discussion
One hundred and twenty attendees were treated to an afternoon and evening of talks, demonstrations, the Climb! performance, and an expert panel discussion with Jon Eaves (The Rattle), Paul Sanders (state51), Peter Langley (Origin UK), Tracy Redhead (award-winning musician, composer and interactive producer, University of Newcastle, Australia), Maria Kallionpää (composer and pianist, Hong Kong Baptist University), chaired by Mark d’Inverno (Goldsmiths).
Throughout the day, Rivka Gottlieb, harpist and music therapist, performed pieces based on her collaboration with PI Professor David de Roure and the Centre's project ‘Numbers into Notes’. Other speakers included George Fazekas, who outlined the Audio Commons Initiative; Tracy Redhead and Florian Thalmann, who presented their work on semantic player technologies; and Ben White, who spoke about the Open Music Archive project (exploring the intersection between art, music and archives).
Highlights of the day included:
Carolan Guitar - Connecting the Digital to the Physical – The Carolan Guitar tells its own story. Play the guitar, contribute to its history, scan its decorative patterns and discover its story. Carolan uses a unique visual marker technology to link the physical instrument to the places it has been, the people who’ve played it and the songs it has sung, along with deep learning techniques to improve event detection.
FAST DJ – FAST DJ is a web-based automatic DJ system and plugin that can be embedded into any website. It generates transitions between any pair of successive songs and uses machine learning to adapt to the user’s taste via simple interactive decisions.
Grateful Dead Concert Explorer – A web service for the exploration of recordings of Grateful Dead concerts, drawing its information from various Web sources. It demonstrates how Semantic Audio and Linked Data technologies can produce an improved user experience for browsing and exploring music collections. See Thomas Wilmering explaining more about the Grateful Dead Concert Explorer.
Jam with Jamendo – Jam with Jamendo brings music learners and unsigned artists together by recommending suitable songs as new and varied practice material. In this web app, users are presented with a list of songs based on their selection of chords. They can then play along with the chord transcriptions or use the audio as backing tracks for solos and improvisations. Using AI-generated transcriptions makes it easy to grow the underlying music catalogue without human effort. See Johan Pauwels explaining more about Jam with Jamendo.
MusicLynx – a web platform for music discovery that collects information and reveals connections between artists from a range of online sources. The information is used to build a network that users can explore to discover new artists and how they are linked together.
The SOFA Ontological Fragment Assembler – enables the combination of musical fragments – Digital Music Objects, or DMOs – into compositions, using semantic annotations to suggest compatible choices.
Numbers into Notes – experiments in algorithmic composition and the relationship between humans, machines, algorithms and creativity. See David de Roure explaining more about the research.
rCALMA Environment for Live Music Data Science – a big data visualisation of key in the Live Music Archive using Linked Data to combine programmes and audio feature analysis. See Oxford e-Research Centre's David Weigl talking about rCALMA.
Climb! Performance Archive – Climb! is a non-linear composition for Disklavier piano and electronics. This web-based archive creates a richly indexed and navigable archive of every performance of the work, allowing audiences and performers to engage with the work in new ways.
Follow FAST on Twitter @semanticaudio
Article courtesy of FAST.