
Complex data becomes easier to interpret when transformed into music

Published on 30.10.2023
Tampere University
Sonification of heart rate data. Photo: Jonathan Middleton.
A team of researchers in the field of human-technology interaction at Tampere University and Eastern Washington University has demonstrated how custom-built “data-to-music” algorithms can help people better understand complex data. Transforming digital data into sound could be a game-changer in the growing world of data interpretation.

The five-year research project was carried out by a group of researchers at TAUCHI, the Tampere Unit for Computer-Human Interaction at Tampere University, Finland, and at Eastern Washington University in the United States. The research was funded by Business Finland.

The group recently released a research paper that presents the case for transforming data into musical sounds, offering a new dimension for interpretation.

The lead author of the article is Jonathan Middleton, DMA, a professor of music theory and composition at Eastern Washington University and a visiting researcher at Tampere University. Middleton and his co-investigators were primarily concerned with showing how custom-built “data-to-music” algorithms could enhance engagement with complex data. In their research they used data collected from Finnish weather records.

“In a digital world where data gathering and interpretation have become embedded in our daily lives, researchers propose new perspectives for the experience of interpretation,” says Middleton.

According to him, the study validated what he calls a ‘fourth’ dimension in data interpretation through musical characteristics.

“Musical sounds can be a highly engaging art form in terms of pure listening entertainment and, as such, a powerful complement to theater, film, video games, sports, and ballet. Since musical sounds can be highly engaging, this research offers new opportunities to understand and interpret data through our aural senses as well,” Middleton explains.

For instance, imagine a simple one-dimensional view of your heart rate data on a graph. Then imagine a three-dimensional view of that data reflected in numbers, colors, and lines. Now, imagine a fourth dimension in which you can actually listen to the data. The key question in Middleton’s research is: which of those displays or dimensions helps you understand the data best?
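To make the idea of listening to data concrete, here is a minimal sketch of one common sonification approach: linearly rescaling a series of data values (such as heart rates) onto musical pitches. This is a hypothetical toy mapping for illustration only, not the algorithm developed in the study.

```python
# Toy sonification sketch: map a data series onto MIDI pitch numbers
# by linear rescaling. Higher heart rates become higher notes.
# Illustrative only -- not the custom algorithm from the research.

def scale_to_midi(values, low_note=48, high_note=84):
    """Linearly rescale numbers into the MIDI note range [low_note, high_note]."""
    lo, hi = min(values), max(values)
    if hi == lo:
        # Constant data: everything maps to the middle of the range.
        return [(low_note + high_note) // 2] * len(values)
    span = high_note - low_note
    return [round(low_note + (v - lo) / (hi - lo) * span) for v in values]

# Example: a short heart-rate series (beats per minute).
heart_rates = [62, 65, 70, 88, 110, 95, 72, 60]
print(scale_to_midi(heart_rates))  # -> [49, 52, 55, 68, 84, 73, 57, 48]
```

The resulting note numbers could then be played back with any MIDI synthesizer, turning the rise and fall of the data into a melody the listener can follow by ear.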

For many people, in particular businesses that rely on data to meet consumer needs, this rigorous validation study shows which musical characteristics contribute the most to engagement with data. As Middleton sees it, the research sets the foundation for using that fourth dimension in data analysis.

The scientific article “Data-to-music sonification and user engagement” was published in the journal Frontiers in Big Data on 10 August 2023.

See and hear Jonathan Middleton’s presentation based on the results of the data-to-music research.

Further information

Jonathan Middleton
+1 509 990 2461
jonathan.middleton [at]