
Exploring scientific concepts and data with sonification

This post is the second in a series about sonification, the representation of data with nonspeech audio. I will demonstrate that sonification is an effective tool for learning scientific concepts and analyzing data, giving examples from the field of astronomy.

The Chandra Photo Album Sonification Collection, hosted by the Harvard-Smithsonian Center for Astrophysics, shows that sonification can both convey information about cosmic images and express the feelings of wonder those images evoke. Moving from sonification as storytelling to sonification in research, I will also look at the Astronify project, which creates audio and visual displays of telescope observations that astronomers use to detect exoplanets outside our solar system.

In my post Introducing Sonification, I discussed using sound to enhance storytelling. A classic example of this technique is the children’s story “Peter and the Wolf,” in which Peter is represented by strings and the other characters are introduced by different instruments from the woodwind, brass, and percussion families.

Astronomers, musicians, and data scientists create sonifications using a similar storytelling technique. A sonification of the center of the Milky Way galaxy combines data from NASA’s Chandra X-ray Observatory, Hubble Space Telescope, and Spitzer Space Telescope. I will describe it in some detail before providing the link where you can play or download video or audio of the sonification.

Each telescope reveals different phenomena and is represented by a different instrument. The piano was chosen for Spitzer, strings for Hubble, and the chimes of the glockenspiel for Chandra. 

The Chandra photo album includes a detailed description of the image and the sonification of the center of the Milky Way galaxy. I am quoting from their description because it is rich with vivid details.

The sonification “begins on the left side of the image and moves to the right, with the sounds representing the position and brightness of the sources. The light of objects located towards the top of the image are heard as higher pitches while the intensity of the light controls the volume. Stars and compact sources are converted to individual notes while extended clouds of gas and dust produce an evolving drone. The crescendo happens when we reach the bright region to the lower right of the image. This is where the 4-million-solar-mass supermassive black hole at the center of the Galaxy, known as Sagittarius A* (A-star), resides, and where the clouds of gas and dust are the brightest.”

Beyond expressing the wonder of a complicated image, the sonification of the center of the Milky Way demonstrates that sound can be used to explain scientific concepts. This sonification uses changes in pitch to represent different wavelengths in the electromagnetic spectrum: infrared, optical, and X-ray.

Low pitches are assigned to longer wavelengths, and higher pitches represent shorter wavelengths. The low frequencies correspond to the infrared wavelengths observed by Spitzer. The mid-range pitches fall in the optical range of the electromagnetic spectrum, the part visible to the human eye, captured by Hubble. The higher-pitched chimes are the X-ray wavelengths from Chandra, which are invisible to the human eye and detectable only with a specialized telescope.
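To make that mapping concrete, here is a minimal Python sketch of the kind of translation the Chandra team describes: scanning an image from left to right turns column position into elapsed time, each row gets its own pitch (higher rows, higher pitches), and pixel brightness sets the volume. The function and the toy image below are my own illustration of the idea, not the pipeline NASA actually used.

import numpy as np

def sonify_image(image, duration=5.0, f_min=220.0, f_max=880.0, rate=44100):
    """Scan a 2-D brightness array left to right.

    Column position becomes time, row position becomes pitch (top rows are
    highest), and brightness becomes volume. Illustrative only.
    """
    n_rows, n_cols = image.shape
    samples_per_col = int(duration * rate / n_cols)
    t = np.arange(samples_per_col) / rate
    # Row 0 is the top of the image, so it gets the highest frequency.
    freqs = np.linspace(f_max, f_min, n_rows)
    audio = np.zeros(n_cols * samples_per_col)
    for col in range(n_cols):
        chunk = np.zeros(samples_per_col)
        for row in range(n_rows):
            brightness = image[row, col]
            if brightness > 0:
                # Brightness controls volume; row controls pitch.
                chunk += brightness * np.sin(2 * np.pi * freqs[row] * t)
        audio[col * samples_per_col:(col + 1) * samples_per_col] = chunk
    peak = np.max(np.abs(audio))
    return audio / peak if peak > 0 else audio

# A tiny fake "image": one bright compact source and a faint extended cloud.
demo = np.zeros((16, 64))
demo[3, 20] = 1.0          # bright source near the top -> a loud, high note
demo[12, 40:60] = 0.2      # faint cloud near the bottom -> a quiet, low drone
samples = sonify_image(demo)  # save with scipy.io.wavfile or play with sounddevice

In the real sonification, of course, three separate images (infrared, optical, X-ray) are scanned at once and each is voiced by its own instrument.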

Visit the Chandra photo album to play and download audio or video files. The videos show animations made from the images, and the sonification provides the accompanying audio track.

A note for visitors navigating the Chandra photo album website with a keyboard and a screen reader (voice output): the page does not have headings. The embedded video player for the Galactic Center Sonification sits immediately below a long list of links for filtering the photo album by date or by subject.

In the same way that sonification provides a soundscape for the complicated cosmic images in the Chandra Photo Album, it is an effective tool for analyzing scientific data.

Astronify is a software package developed by the Space Telescope Science Institute that uses the Python programming language to turn telescope observations into sound. After installing the package, users download astronomical data and run a command to render it as a sonification; they can also display the same data visually as a graph.
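As a rough sketch of that workflow, the snippet below follows the SoniSeries interface described in the Astronify documentation (the package is installed with pip install astronify). The table values here are made up; in practice you would load real observations, such as a light curve downloaded from a telescope archive.

from astropy.table import Table
from astronify.series import SoniSeries

# A small, made-up set of observations. Astronify expects an astropy Table
# with "time" and "flux" (brightness) columns by default.
data = Table({"time": [0, 1, 2, 3, 4, 5],
              "flux": [1.00, 0.99, 1.01, 0.60, 1.00, 1.02]})

soni = SoniSeries(data)      # wrap the observations
soni.sonify()                # map each flux value to a pitch
soni.play()                  # listen to the result
soni.write("example.wav")    # or save the audio to a file

# The same data can be displayed visually as a graph, e.g. with matplotlib:
# import matplotlib.pyplot as plt
# plt.plot(data["time"], data["flux"]); plt.show()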

Astronify can be used to detect exoplanets as they orbit stars outside our solar system. Visually, the light from a star dims slightly when a planet passes between the star and the telescope; astronomers call this a transit. Sonically, Astronify represents time as a constant drone, the observations as changes in pitch, and the transit as an interruption in those sounds.
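To hear what a transit might sound like, one can feed Astronify a synthetic light curve with a small dip in brightness. The numbers below (a one percent dip between times 4 and 5) are invented for illustration, and the SoniSeries calls again follow the project documentation.

import numpy as np
from astropy.table import Table
from astronify.series import SoniSeries

# A toy light curve: constant brightness except where a hypothetical
# planet crosses in front of its star.
time = np.arange(0, 10, 0.1)
flux = np.ones_like(time)
flux[(time >= 4.0) & (time <= 5.0)] = 0.99   # 1% dip during the transit

transit = SoniSeries(Table({"time": time, "flux": flux}))
transit.sonify()
transit.play()   # the transit should be audible as a brief change in pitch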

Visit the Astronify project website to play examples of sonified data and try the games to test your ability to understand the data by ear. The example recordings of sonification, other media, and games are located below the documentation and tutorials for installing and running the Astronify program.

Below are links to media coverage of the creators of the sonifications highlighted in this post.

Listen to NASA Is Figuring Out How Outer Space Sounds on the National Public Radio (NPR) program Weekend Edition. This interview with Kim Arcand from the Chandra X-ray Observatory was broadcast on Saturday, May 28, 2022.

Space Sonification is an episode of the podcast Out of the Blocks published on September 14, 2020. The episode features Astronify team members discussing the contrast between their isolation during the COVID-19 lockdown and their work on cosmic phenomena.

The examples that I provide here show that sonification is an effective tool to communicate scientific concepts and explore data. In future posts, I will give a brief overview of some web-based tools to produce sonification and share my thoughts about how museums can use sonification as an audio equivalent to a visual display of content.
