Can Multisensory Experiences of the Solar Eclipse influence the creation of accessible STEM content for everyone?
Before I share my thoughts on this question, I will describe an innovative project that created an audio experience of the solar eclipse on April 8, 2024.
A Tale of Two Eclipses: 2017 and 2024
In 2017, I encountered articles in the media breathlessly praising a smartphone app that could translate the eclipse into sound.
Eclipse Soundscapes is a free iOS and Android application that provides information about solar eclipses. The app includes high-resolution images that play different sounds and give haptic feedback as the user moves their finger across the image. Text-based descriptions accompany each image.
I downloaded the Eclipse Soundscapes app in 2017. Sometime before the eclipse, I explored the multisensory content, but I did not open the app during the event. The word “passive” best describes my experience of the 2017 eclipse.
Fast forward to spring 2024, when I read several articles about The LightSound Project. Scientists at the Harvard/Smithsonian Center for Astrophysics designed and built devices that would translate the visual experience of the eclipse into an audio mode.
The LightSound device is innovative because it works in real time. A sensor detects the intensity of the light, and the device hardware matches the variation in light levels with several different sounds.
The bright light before the eclipse is represented by a flute. As the light dims, the sound decreases in pitch and is represented by a clarinet.
When totality occurs, the tones are replaced by a clicking sound that lasts as long as the Moon is between the Sun and the Earth, blocking the light. As the sunlight reappears, the tones return and rise in pitch.
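The dimming-to-pitch mapping described above can be sketched in a few lines of code. This is a hypothetical illustration, not the actual LightSound firmware: the totality threshold and the pitch range are my own assumptions, and the real device plays sampled flute and clarinet tones rather than raw frequencies.

```python
# Hypothetical sketch of a LightSound-style mapping (not the real firmware):
# brighter light maps to a higher pitch, and near-darkness at totality
# switches from tones to clicks.

TOTALITY_THRESHOLD = 0.02  # assumed fraction of full sunlight

def light_to_sound(light_level: float):
    """Map a normalized light reading (0.0 = dark, 1.0 = full sun)
    to either a tone frequency in Hz or a click event."""
    if light_level < TOTALITY_THRESHOLD:
        return ("click", None)  # totality: tones stop, clicks begin
    # Simple linear mapping from light level to pitch (assumed range).
    low_hz, high_hz = 220.0, 880.0
    return ("tone", low_hz + (high_hz - low_hz) * light_level)

# As the eclipse progresses, readings fall, the pitch drops, then clicks start:
for reading in (1.0, 0.5, 0.1, 0.01):
    print(light_to_sound(reading))
```

Running the loop shows the tone frequency falling as the light dims, then switching to clicks at totality, which mirrors the flute-to-clarinet-to-clicks progression the device produces.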
I experienced the 2024 eclipse both by listening to a live stream along the nationwide path of totality, and by going outside at my location in Baltimore, Maryland, where a partial eclipse blocked about 85% of the sunlight.
The American Council of the Blind, ACB Media, and the Harvard/Smithsonian Center for Astrophysics broadcast the eclipse as it reached totality at different times from Texas through Vermont. People connected their LightSound devices to Zoom. The broadcast of the eclipse was successful from Texas, Indiana, Ohio, and New York, but the Vermont location had technical difficulties.
As I listened to the broadcast of LightSound devices along the path of totality, I noticed variation in the timing of the tones and in the volume and speed of the clicks. The frequency of the sound is controlled in real time by the intensity of the light, which varies because of the motion of the Earth and the Moon: the Moon continues to circle the Earth throughout the eclipse, and the Earth rotates on its axis.
At the same time as I was listening to broadcasts from the path of totality, I heard people outside at my location in Baltimore, Maryland. Small groups had gathered on the balconies of my building to view the eclipse.
My balcony faces south, and it is usually sunny. During the eclipse, the temperature dropped. My balcony was cool and shady.
I experienced the 2024 solar eclipse in two modes: listening to sonification from the path of totality on Zoom, and feeling the temperature drop in person. Based on my experience in 2017, I was not expecting this level of interactivity during the 2024 eclipse.
The Benefits of Multisensory Experiences
Sonification levels the playing field for blind people. The technique allows us to hear data and analyze it independently without relying on the description of a sighted person. I have written about research published in Nature Astronomy, and in other scientific fields. You can learn more by reading my older posts on sonification.
The LightSound devices can be used for other tasks beyond detecting the eclipse. For example, they could make a sound to indicate the status of lights on laboratory instruments.
I can imagine an exhibit designer using sonification to indicate the status of a light on a display panel. I have encountered thousands of buttons that light up, but none of them make a sound.
The media coverage of both LightSound in 2024 and the Eclipse Soundscapes app in 2017 focused on their benefit for blind people. Almost nothing was written about creating a multisensory experience for sighted people.
As long as these projects are viewed as serving a relatively small population, there is a low probability of integrating sonification into experiences designed for the general public.
If most STEM content continues to be created in a visual mode and audio is treated as a niche product, I fear that blind bloggers writing about the next solar eclipse in 2044 will observe a sudden rise of sonification projects meant only for them to experience the eclipse. We are missing an opportunity to incorporate sonification into exhibits meant for everyone.