Ada Lovelace-inspired project presented at MobileHCI 2016 workshop

Professor David De Roure and Pip Willcox presented an Ada Lovelace-inspired paper, 'Numbers in Places: Creative Interventions in Musical Space & Time', at the Audio in Place workshop at ACM MobileHCI 2016 in Italy this September.

The goal of the Audio in Place workshop was to explore the possibilities, issues, challenges, and applications of methods for understanding, creating meaning from, and using audio content for users of mobile devices. The workshop was part of MobileHCI 2016, the 18th International Conference on Human-Computer Interaction with Mobile Devices and Services, which this year was held in Tuscany from 7-9 September.

Nearly two centuries ago, Ada Lovelace noted that Charles Babbage's hypothetical Analytical Engine "might compose elaborate and scientific pieces of music of any degree of complexity or extent". As part of the Lovelace anniversary celebration in 2015, the Numbers into Notes project explored how this might have been achieved on the giant steam-powered machine using the mathematics of the time. An interactive web application was developed that generates a number sequence, reduces it using clock (modular) arithmetic, and maps the result to notes; the music can then be explored by selecting fragments to play.
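The idea can be illustrated with a short, self-contained sketch. The choice of sequence (Fibonacci), modulus (12), and chromatic note mapping below are illustrative assumptions for this example, not necessarily the exact settings of the web application.

```cpp
// Minimal sketch of the "Numbers into Notes" idea: generate a number
// sequence, reduce it with clock (modular) arithmetic, and map the
// residues onto pitches.
#include <iostream>
#include <string>
#include <vector>

int main() {
    const int modulus = 12;  // one octave of semitones (illustrative choice)
    const std::vector<std::string> notes = {
        "C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"};

    // Generate a Fibonacci sequence and reduce each term mod 12.
    std::vector<int> residues;
    unsigned long long a = 0, b = 1;
    for (int i = 0; i < 32; ++i) {
        residues.push_back(static_cast<int>(a % modulus));
        unsigned long long next = a + b;
        a = b;
        b = next;
    }

    // Map the residues to note names: a "fragment" that could be played.
    for (int r : residues) std::cout << notes[r] << ' ';
    std::cout << '\n';
    return 0;
}
```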

Now Professor De Roure and Pip Willcox (Head of the Centre for Digital Scholarship, Bodleian Libraries) are asking what Lovelace might do today, with an 'orchestra' of microcontrollers instead of the Analytical Engine.

Arduino-based devices (programmable microcontroller boards) designed in the Oxford e-Research Centre replicate the Numbers into Notes web application as small standalone 'music engines'. Participants can then control these using infrared remote controls and proximity sensors to select and map a subset of notes to individual instruments.
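A hedged sketch of what such a standalone 'music engine' might look like in Arduino code follows. The pin assignments, the use of an analogue proximity sensor, and the fixed eight-note fragment are assumptions made for illustration, not the Centre's actual design.

```cpp
// Illustrative Arduino sketch: regenerate the reduced sequence on the board
// and play a fragment of it through a piezo speaker, letting a proximity
// reading choose where the fragment starts.
const int SPEAKER_PIN = 8;    // piezo speaker pin (assumed wiring)
const int SENSOR_PIN  = A0;   // analogue proximity sensor (assumed)
const int MODULUS     = 12;
const int SEQ_LEN     = 24;   // Fibonacci mod 12 repeats after 24 terms

// Equal-tempered frequencies (Hz) for one octave starting at middle C.
const int FREQS[MODULUS] = {262, 277, 294, 311, 330, 349, 370, 392,
                            415, 440, 466, 494};
int residues[SEQ_LEN];

void setup() {
  // Precompute the Fibonacci sequence reduced mod 12.
  unsigned long a = 0, b = 1;
  for (int i = 0; i < SEQ_LEN; ++i) {
    residues[i] = a % MODULUS;
    unsigned long next = a + b;
    a = b;
    b = next;
  }
}

void loop() {
  // Map the sensor reading to a starting position in the sequence,
  // then play an eight-note fragment from that point.
  int start = map(analogRead(SENSOR_PIN), 0, 1023, 0, SEQ_LEN - 1);
  for (int i = 0; i < 8; ++i) {
    tone(SPEAKER_PIN, FREQS[residues[(start + i) % SEQ_LEN]], 200);
    delay(250);
  }
}
```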

As the simulated Analytical Engine records people's behaviour, it could learn to make musical suggestions that complement and augment the compositions of the crowd. The devices can also be digitally coupled, machine-to-machine, through infrared, Bluetooth, or indeed audio, so that compositions might evolve.

The research was supported by the FAST project (Fusing Audio and Semantic Technologies for Intelligent Music Production and Consumption), which is funded by the UK Engineering and Physical Sciences Research Council (EPSRC), in collaboration with Queen Mary University of London and the University of Nottingham; and by Transforming Musicology, a project funded by the UK Arts and Humanities Research Council (AHRC) in collaboration with Goldsmiths, University of London.

See tweets from the MobileHCI conference @ACMMobileHCI.