Listen to Music Made From Yellowstone’s Seismic Data
A scientist and a musician performed a live musical rendition of the park’s underground rumblings
Yellowstone National Park is one of the most seismically active places in the United States, yet most people never get to hear the Earth's tectonic plates shifting far below the surface.
Now, a scientist and a musician have collaborated to change that by turning the park’s seismic readings into music. The duo behind the innovative Yellowstone score consists of Domenico Vicinanza, a particle physicist and composer at Anglia Ruskin University, and Alyssa Schwartz, a flutist and musicologist at Fairmont State University.
Vicinanza has developed a computer program that can translate the park’s underground vibrations into sheet music, a process known as data sonification. Last week, Schwartz played those notes in real time at the 2023 Internet2 Community Exchange conference in Atlanta.
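The basic idea behind data sonification can be illustrated with a short sketch: rescale a stream of measurements and map each value onto a note in a musical scale. This is only a minimal illustration of the general technique, not Vicinanza's actual program; the sample readings and the choice of a C-major scale are hypothetical.

```python
# Minimal data-sonification sketch: map seismic amplitude readings
# onto musical pitches. Sample data and scale choice are hypothetical.

# Hypothetical seismograph amplitude samples (arbitrary units)
readings = [0.02, 0.15, 0.31, 0.08, 0.56, 0.44, 0.12, 0.73]

# One octave of a C-major scale, as MIDI note numbers (C4..C5)
SCALE = [60, 62, 64, 65, 67, 69, 71, 72]

def sonify(samples):
    """Rescale each sample to [0, 1], then pick the nearest scale degree."""
    lo, hi = min(samples), max(samples)
    span = (hi - lo) or 1.0  # avoid division by zero for flat data
    notes = []
    for s in samples:
        idx = round((s - lo) / span * (len(SCALE) - 1))
        notes.append(SCALE[idx])
    return notes

print(sonify(readings))
```

Quieter tremors land on lower notes and stronger ones on higher notes, so a listener can follow the shape of the seismic record by ear; a real sonification pipeline would also map timing and other features of the signal.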
During the presentation, the pair explained how “technology-enabled music creation can convey Yellowstone’s unique geophysical systems to raise awareness about geology and geophysics,” with a special emphasis on how the music could help those with visual impairments experience the country’s oldest national park, according to the conference’s website.
The data come from a seismograph located near the middle of the park. Vicinanza chose to use the tremors recorded at that site because of their variability.
“Nearly 50 percent of the earthquakes [at Yellowstone] occur in swarms that cluster together,” says Vicinanza to the Guardian’s Nicola Davis. “So it’s a fantastic playground for any kind of scientist that is interested in seismology, geophysics, mechanics or, like me, data science and music.”
For Schwartz, the live performance in Atlanta last week presented two main challenges: playing a piece of music for the very first time in front of an audience, and figuring out where among the nonstop notes to take a breath. Since the Earth's plates are nearly always moving, the data don't leave room for any pauses for human breathing, per the Washington Post's Leo Sands.
“This was a whole new type of artistry, actually,” she tells NPR’s Ayesha Rascoe. “I’ve never been in another situation where I was asked to sight read and then also add my own flair on the spot. So it was a very unique experience.”
This isn’t Vicinanza’s first experience bringing science and art together. He’s also made music using data from the Large Hadron Collider particle accelerator near Geneva, Switzerland, as well as data collected at the moment NASA’s Voyager 1 crossed over into interstellar space.
Next up, the duo wants to perform a live duet with a whale by capturing the creature's underwater sounds with a hydrophone. Enjoying music created from data can help listeners develop a "mental image of what's happening" with the world around them, as Vicinanza tells the Washington Post, and then, ideally, become interested in science more broadly. Sonification may also help scientists with their own work.
“We can listen to patterns, we can identify patterns,” says Vicinanza to Forbes’ Eva Amsen. “Our ears are so good at doing that.”