Artificial intelligence is reshaping marine research by decoding ocean sounds to reveal hidden marine life. By analyzing acoustic signals, AI helps scientists identify species, track their movements, and understand underwater ecosystems, strengthening conservation efforts and sharpening our picture of marine biodiversity.
Studying coral reefs once involved hours of tedious manual analysis, but artificial intelligence is transforming the process.
A new neural network can analyze ocean sounds in real time, identifying fish activity more than 25 times faster than human experts. This breakthrough could revolutionize how scientists monitor reef health and safeguard marine ecosystems.
Coral reefs are some of the most biodiverse ecosystems on Earth. Although they occupy less than 1% of the ocean, they provide a habitat for around 25% of marine species during different stages of their life cycles. With such immense biodiversity in one area, scientists face challenges in accurately identifying the species present and their populations.
To address this, researchers at the Woods Hole Oceanographic Institution have developed a new technique, combining acoustic monitoring with a neural network to study fish activity based on sound. Their findings were published today (March 11) in JASA, the journal of the Acoustical Society of America, through AIP Publishing.
For years, scientists have used passive acoustic monitoring to study coral reefs, which involves placing underwater recorders at reefs to capture ambient sounds over months. While existing signal processing tools can handle large audio datasets, they aren’t equipped to detect specific sounds. Identifying individual fish calls or species-specific noises still requires researchers to manually sift through hours of recordings.
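The article doesn't spell out the team's pipeline, but in passive acoustic work a long recording is typically sliced into short windows and converted to spectrograms before any detector scores it. The sketch below illustrates that standard preprocessing step; the file name, window length, and spectrogram settings are illustrative assumptions, not details from the study.

```python
# Sketch of the usual preprocessing step in passive acoustic monitoring:
# slice a long hydrophone recording into short windows and convert each
# window to a log-mel spectrogram that a detector can score.
import librosa

# Load one long reef recording (illustrative file name).
audio, sr = librosa.load("reef_recording.wav", sr=None, mono=True)

# Slice the recording into fixed-length windows (2 s is an assumption).
window_s = 2.0
hop = int(window_s * sr)
windows = [audio[i:i + hop] for i in range(0, len(audio) - hop + 1, hop)]

# Convert each window to a log-mel spectrogram, a common input
# representation for audio neural networks.
spectrograms = [
    librosa.power_to_db(librosa.feature.melspectrogram(y=w, sr=sr, n_mels=64))
    for w in windows
]
print(f"{len(spectrograms)} windows, each of shape {spectrograms[0].shape}")
```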
"But for those doing it, it’s grueling work," said author Seth McCammon. "It's incredibly tedious and miserable."
Manual analysis is also too slow for practical applications. With many of the world’s coral reefs threatened by climate change and human activities, it’s crucial to quickly identify and track changes in reef populations for conservation efforts.
“It can take years for humans to analyze this data,” McCammon explained. “Analyzing it manually is not scalable.”
In response, the researchers trained a neural network to automatically process vast amounts of acoustic data, analyzing audio in real time. Their algorithm matches the accuracy of human experts but does so more than 25 times faster, potentially transforming ocean monitoring and research.
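The article doesn't describe the network's architecture, so the following is only a rough sketch of the general approach: a small convolutional classifier that scores each spectrogram window for the presence of a fish call. The layer sizes and the two-class (call vs. no call) framing are assumptions for illustration.

```python
# Hypothetical sketch of a fish-call detector: a small CNN that scores
# log-mel spectrogram windows. Architecture details are assumptions;
# the model in the JASA paper may differ substantially.
import torch
import torch.nn as nn

class CallDetector(nn.Module):
    def __init__(self, n_classes: int = 2):   # call vs. no call (assumed)
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),           # one feature vector per window
        )
        self.classifier = nn.Linear(32, n_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, 1, mel_bins, time_frames)
        return self.classifier(self.features(x).flatten(1))

model = CallDetector().eval()
batch = torch.randn(8, 1, 64, 173)              # 8 dummy spectrogram windows
with torch.no_grad():
    p_call = model(batch).softmax(dim=1)[:, 1]  # per-window call probability
print(p_call)
```

Because a network scores whole batches of windows in a single forward pass, it can work through months of recordings far faster than a human listening in sequence, which is the kind of throughput that makes large speedups like the one reported here possible.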
"Now that we don’t need humans involved, what other kinds of devices could we use?" McCammon asked. "My co-author Aran Mooney is exploring integrating this neural network onto a floating mooring to provide real-time updates on fish call counts. We are also testing it on our autonomous underwater vehicle, CUREE, to detect fish calls and map areas with biological activity.”
This technology could solve a longstanding issue in marine acoustic research: identifying which fish make which sounds.
"With most species, we can’t yet say for sure that a call came from a specific fish species," said McCammon. "That’s the 'holy grail' we’re striving for. By detecting fish calls in real time, we can build devices that automatically hear a call and identify which fish are nearby."
In the future, McCammon hopes this neural network will enable researchers to monitor fish populations in real time, pinpoint species in need of attention, and quickly respond to disasters. This technology could provide conservationists with a clearer understanding of coral reef health in an era when reefs are facing increasing threats.
Source: SciTechDaily