
New digital resource matches business brand values to signature music

Posted on 29 June 2018

Researchers at the University of York have developed an algorithm that matches the brand characteristics of a business or industry to a particular type of music, to help improve the impact of marketing methods on potential clients and customers.


The algorithm can predict the brand-fit of music with an accuracy of 80.1%.

The team, working with the Technical University of Berlin and the audio branding company HearDis!, recruited 10,000 participants across England, Germany and Spain to listen to different types of music and attach descriptive words to each recording.

Participants' responses were further characterised by socio-economic group, as well as by age, gender, and ethnicity. A computer programme was then trained on all of this information to predict how an individual would react to any new piece of music entered into the system.
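As a rough, illustrative sketch only: the article does not describe the project's actual features, model or data layout, so every column name, the file name and the choice of regressor below are assumptions, but a prediction model of this general kind could be trained in Python along the following lines.

import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import r2_score
from sklearn.model_selection import train_test_split

# Hypothetical training table: one row per (listener, excerpt) rating, with
# simple audio descriptors and listener demographics as inputs and the rated
# strength of one brand adjective (here "warm") as the target.
data = pd.read_csv("listener_ratings.csv")  # assumed file name and layout

features = ["tempo", "spectral_brightness", "mode_major",
            "listener_age", "listener_gender", "socio_economic_group"]
target = "rating_warm"

X = pd.get_dummies(data[features])  # one-hot encode the categorical columns
y = data[target]

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Fit a generic regressor and check how well it predicts ratings it has not seen.
model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(X_train, y_train)
print("held-out R^2:", round(r2_score(y_test, model.predict(X_test)), 3))

Trained per adjective in this way, such a model could score any new recording against each brand word, which is what would let a system of this kind respond to pieces of music it has never encountered before.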

Expensive exercise

Dr Hauke Egermann, a music psychologist from the University of York’s Department of Music, said: “Companies looking to promote themselves on radio, television or podcasts for example, often hire in a team of professionals to link their brand values with appropriate visual images and music.  

“Experts in their field are employed to analyse music and its impact on the listener, but music experiences are highly subjective, so what an expert might say is the right sound to elicit the right response from a company’s audience might not be what the end-listener actually perceives.

“It is an expensive exercise, so we have worked to make this process quicker and more accurate by developing an algorithm that matches certain common brand words with various types of music. Most importantly the system is built on the findings from listener experience, not music or marketing experts.” 

Brand-fit

A company may, for example, input words such as warm, young and progressive into the computer system; these words are then matched with a piece of music selected by the system. Companies may then choose to use the music in their marketing activities, or use the search result as a guide for a composer to come up with something new.
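As a further hedged sketch (the ABC_DJ recommendation algorithm itself is not detailed in this article), the matching step could work by scoring a brand-word query against a catalogue of tracks whose adjective ratings have already been predicted; the track names and scores below are invented purely for illustration.

from typing import Dict, List

# Hypothetical catalogue: each track carries predicted scores (0 to 1) for a
# fixed vocabulary of brand adjectives.
catalogue: Dict[str, Dict[str, float]] = {
    "track_a": {"warm": 0.9, "young": 0.4, "progressive": 0.3, "funny": 0.1},
    "track_b": {"warm": 0.6, "young": 0.8, "progressive": 0.7, "funny": 0.2},
    "track_c": {"warm": 0.2, "young": 0.9, "progressive": 0.8, "funny": 0.7},
}

def rank_tracks(query: List[str], top_n: int = 3) -> List[str]:
    """Rank tracks by their mean predicted score over the queried adjectives."""
    scored = {
        name: sum(scores.get(word, 0.0) for word in query) / len(query)
        for name, scores in catalogue.items()
    }
    return sorted(scored, key=scored.get, reverse=True)[:top_n]

# A query such as "warm, young and progressive" returns the best-fitting tracks first.
print(rank_tracks(["warm", "young", "progressive"]))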

Dr Jochen Steffens, from the Technical University of Berlin, said: “The ABC_DJ recommendation algorithm can predict the brand-fit of music or perceived musical expression with an accuracy of 80.1%. 

“The theoretical maximum of 100% can never be reached because people have, and always will have, different reactions to music; this means an 80% match will be exceptionally valuable to the industry.

“Our overall aim is to provide European creative agencies that are active in the field of audio branding, with sophisticated tools that support the process of creating music and fostering audio branding campaigns.”  

Shop trial

The team have also trialled the system with a music player in a shop environment, where music is particularly important to set the mood of the customer experience. 

The new system is not yet available to individual listeners who might, for example, want to match music to a particular mood or personal occasion, but researchers say it has the potential for many different uses in the future.

It is anticipated that the system, which is yet to be named, will be available to companies in the next 12 months. 


Media enquiries

Samantha Martin
Deputy Head of Media Relations (job share)

Tel: +44 (0)1904 322029

About this research

The ABC_DJ project has received funding from the European Union’s Horizon 2020 research and innovation programme and comprises several companies and research institutions from different European countries.

Listen online to a music excerpt that the algorithm predicted to sound bright, playful, and funny, and to another predicted to sound loving, friendly, and warm.

For more information about the ABC_DJ project, visit www.abcdj.eu
