The sound of a coral reef signals its health

DAILY SCIENCE

Artificial intelligence can track a coral reef’s health by learning its “song”

Researchers demonstrate how to tell damaged reefs from healthy ones using relatively cheap underwater recorders paired with new computer programs.
June 1, 2022

A healthy reef is a noisy place. Fish whoop, croak and growl. Shrimp clack their claws, emitting firecracker-like snaps. Even seaweed makes noise as it photosynthesizes, releasing tiny oxygen-filled bubbles.

Much like a doctor with a stethoscope, scientists are beginning to listen to these sounds to help gauge a coral reef’s health. Now, a team working in Indonesia has paired insights about these sounds with new computer and audio-recording equipment. Their work holds the possibility of creating swifter and cheaper reef health-checks.

“Our findings show that a computer can pick up patterns that are undetectable to the human ear,” said Ben Williams, a marine researcher at University College London who was part of the study. “It can tell us faster, and more accurately, how the reef is doing.”

The health of coral reefs is an urgent issue. Home to a quarter of all marine life, these ecosystems are battered by underwater heatwaves, destructive fishing techniques, pollution, and changes in ocean acidity linked to greenhouse gases. Scientists warn that rising ocean temperatures due to global warming could wipe out virtually all coral reefs this century.

Tracking the condition of individual reefs is tricky and laborious. Often, people need to swim underwater and gather information about the organisms they see. Technologies such as airborne and underwater drones are being developed. Last year, researchers unveiled a satellite-based system to detect when coral reefs around the globe bleach. That’s when overheated coral polyps expel the colorful algae living inside them, leaving the coral bone-white and starving. But such systems are costly and, with the exception of satellites, cover limited areas.

A team of scientists from the United Kingdom and Indonesia, working to restore reefs near the Indonesian island of Sulawesi, wanted to see if they could simplify this monitoring work using relatively cheap underwater sound recorders paired with new computer programs.

In 2018, the researchers placed sound recorders at seven different reefs: two that were healthy, two degraded by coral mining and fishing with dynamite, two where restoration work was less than a year old, and two where restoration had started more than two years earlier. The devices were set to collect one-hour recordings at a variety of times to capture the different sounds that might emerge: around the new moon and full moon, and during the day, at twilight, and at midnight.

They took these 262 recordings and split the sounds into three different sets of frequencies ranging from relatively low to high-pitched sounds. Then they screened each set using 12 measurements identified from previous studies of underwater soundscapes. Those included the diversity of sounds at different frequencies; the variation in intensities of different sounds over time; the randomness of sounds; and the rate of snapping sounds from shrimp.
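The band-splitting and index-screening step can be sketched in code. This is a minimal illustration, not the study’s actual pipeline: the band edges, filter design, and the choice of spectral entropy as the example index are all assumptions made here for demonstration (the paper screened 12 different ecoacoustic measurements).

```python
import numpy as np
from scipy.signal import butter, sosfilt, welch

def band_split(signal, fs, bands):
    """Split a recording into frequency bands with Butterworth band-pass filters."""
    out = {}
    for name, (lo, hi) in bands.items():
        sos = butter(4, [lo, hi], btype="bandpass", fs=fs, output="sos")
        out[name] = sosfilt(sos, signal)
    return out

def spectral_entropy(x, fs, nperseg=1024):
    """One example ecoacoustic index: entropy of the power spectrum,
    normalized to [0, 1]. Higher values mean sound energy is spread
    more evenly across frequencies (a 'busier' soundscape)."""
    _, psd = welch(x, fs=fs, nperseg=nperseg)
    p = psd / psd.sum()
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum() / np.log2(len(psd)))

# Hypothetical one-second clips standing in for reef audio:
# broadband noise (diverse soundscape) vs. a single pure tone.
fs = 16_000
rng = np.random.default_rng(0)
noise = rng.standard_normal(fs)
tone = np.sin(2 * np.pi * 440 * np.arange(fs) / fs)

bands = {"low": (50, 800), "mid": (800, 4000), "high": (4000, 7900)}
split = band_split(noise, fs, bands)

# The broadband clip scores higher on this index than the pure tone.
print(spectral_entropy(noise, fs) > spectral_entropy(tone, fs))
```

In the study, each of the 262 recordings would yield a vector of such index values, one per band, which then becomes the input for the comparison between healthy and degraded sites.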

They picked the eight measurements that differed most between healthy and damaged reefs. No single metric, on its own, was a reliable indicator of whether a reef was in good shape; even the most accurate was little better than a coin flip.
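The coin-flip comparison has a natural quantitative form: for a single metric, one can ask how often a randomly chosen healthy-site value exceeds a randomly chosen degraded-site value (the rank-based AUC, where 0.5 is pure chance). The sketch below uses made-up numbers to show why a weakly informative index hovers near that chance level; it is an illustration of the concept, not the study’s screening procedure.

```python
import numpy as np

def single_metric_auc(values, labels):
    """Probability that a random healthy-site value (label 1) exceeds a
    random degraded-site value (label 0); 0.5 is a coin flip."""
    healthy = values[labels == 1]
    degraded = values[labels == 0]
    wins = (healthy[:, None] > degraded[None, :]).mean()
    ties = (healthy[:, None] == degraded[None, :]).mean()
    return float(wins + 0.5 * ties)

# Hypothetical index values for 200 recordings: the metric only
# barely separates healthy (label 1) from degraded (label 0) sites.
rng = np.random.default_rng(1)
labels = rng.integers(0, 2, 200)
weak = rng.standard_normal(200) + 0.1 * labels

auc = single_metric_auc(weak, labels)
print(round(auc, 2))  # close to 0.5: little better than chance
```

A metric with an AUC near 0.5 is useless alone, yet can still add information when combined with others, which is what the machine-learning step exploits.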

Then the researchers turned to a more complex computer program: a machine-learning approach, in which a computer identifies patterns in oceans of data and improves its accuracy over time.

In this case, the program found combinations of the eight different sound characteristics that corresponded to healthy and damaged reefs. More than 90% of the time, it could accurately detect whether a particular recording came from a healthy or a severely damaged site. Sound clips from mature restoration sites scored as “healthy” more than 80% of the time, while the computer labeled sounds from reefs at the start of restoration as damaged more than 80% of the time, according to results published last week in the journal Ecological Indicators.
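The classification step can be sketched as follows. Everything here is a stand-in: the feature matrix is synthetic (262 recordings by 8 indices, mimicking the study’s dimensions), and a random forest is used as one common choice of classifier, not necessarily the model the researchers trained.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

# Hypothetical stand-in data: 262 recordings x 8 acoustic indices,
# where healthy-site recordings (label 1) score somewhat higher on
# each index than degraded-site recordings (label 0).
rng = np.random.default_rng(42)
n_recordings, n_indices = 262, 8
labels = rng.integers(0, 2, n_recordings)
features = rng.standard_normal((n_recordings, n_indices)) + 0.8 * labels[:, None]

# The classifier learns combinations of indices that separate the
# classes, even though each index alone is only weakly informative.
clf = RandomForestClassifier(n_estimators=200, random_state=0)
scores = cross_val_score(clf, features, labels, cv=5)
print(f"mean cross-validated accuracy: {scores.mean():.2f}")
```

Cross-validation matters here: the accuracy figures reported in the study only mean something if the model is scored on recordings it never saw during training.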

It’s not clear which organisms and sounds were key to these distinctions. The snapping shrimp didn’t help tell damaged from healthy sites, so their snapping rate wasn’t in the final computer algorithm. Other measurements weren’t linked to a particular coral-dweller.

The findings suggest that listening to the symphony of the reef could help scientists track the fate of these critical habitats and provide clues about whether restoration projects are really working, said Tim Lamont, a marine biologist at Lancaster University who was part of the research.

“In many cases it’s easier and cheaper to deploy an underwater hydrophone on a reef and leave it there than to have expert divers visiting the reef repeatedly to survey it,” Lamont said. “Especially in remote locations.”

There’s no sign yet that coral reefs have their own Spotify playlist.


More on coral reefs and bioacoustics:


Williams et al. “Enhancing automated analysis of marine soundscapes using ecoacoustic indices and machine learning.” Ecological Indicators, May 20, 2022.

Image: A researcher deploys a hydrophone on a coral reef in Sulawesi, Indonesia. ©Tim Lamont/University of Exeter.
