AI learns coral reef ‘song’

summary: A new artificial intelligence algorithm trained using sounds from both healthy and degraded reefs can determine reef health 92% of the time.

Source: UCL

Coral reefs have a complex soundscape – and even experts must conduct painstaking analysis to gauge reef health from sound recordings.

In a new study published in Ecological Indicators, the scientists trained a computer algorithm using multiple recordings of healthy and degraded reefs, helping the machine learn to spot the difference.

The computer then analyzed several new recordings, and successfully identified reef health 92% of the time.

The team then used the algorithm to track the progress of reef restoration projects.

Lead author, PhD candidate Ben Williams (UCL Centre for Biodiversity and Environment Research), who began the study at the University of Exeter, said: “Coral reefs are facing many threats, including climate change, so monitoring their health and the success of conservation projects is vital.

“A major difficulty is that visual and acoustic surveys of reefs usually rely on labour-intensive methods.

“Visual surveys are also limited by the fact that many reef organisms hide themselves, or are active at night, while the complexity of reef sounds has made it difficult to identify reef health using individual recordings.

“Our approach to that problem was to use machine learning – to see if a computer could learn the song of the reef.

“Our findings suggest that a computer can pick up patterns that are undetectable to the human ear. This can tell us faster and more accurately how the reef is doing.”

Fish and other creatures living on coral reefs make a variety of sounds.

Image: Fish swimming around a coral reef. Credit: Tim Lamont

The meaning of many of these calls remains unknown, but the new AI method can distinguish the overall sound of a healthy reef from that of an unhealthy one.

The recordings used in the study were taken at the Mars Coral Reef Restoration Project, which is restoring heavily damaged reefs in Indonesia.

Co-author Dr Tim Lamont, from Lancaster University, said the AI method creates major opportunities for improving coral reef monitoring.

“It’s a really exciting development. Sound recorders and AI could be used around the world to monitor the health of reefs, and to find out whether efforts to protect and restore them are working,” said Dr Lamont.

“In many cases it is easier and cheaper to deploy an underwater hydrophone on a reef and leave it there than to have expert divers visit the reef repeatedly to survey it, especially in remote locations.”

Funding: The study was funded by the Natural Environment Research Council and the Swiss National Science Foundation.

About this artificial intelligence research news

Author: Henry Kilworth
Source: UCL
Contact: Henry Kilworth – UCL
Image: The image is credited to Tim Lamont

Original Research: Open access.
“Enhancing automated analysis of marine soundscapes using ecoacoustic indices and machine learning” by Ben Williams et al. Ecological Indicators


Abstract

Enhancing automated analysis of marine soundscapes using ecoacoustic indices and machine learning

Historically, ecological monitoring of marine habitats has relied primarily on labour-intensive, non-automated survey methods. The field of passive acoustic monitoring (PAM) has demonstrated the potential to automate such surveys in marine habitats, primarily through the use of ‘ecoacoustic indices’ that measure features of natural soundscapes.
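
To make the idea of an ecoacoustic index concrete, here is a minimal sketch in Python (the language, the choice of index, and all parameters are illustrative assumptions, not the paper’s actual pipeline) that computes the normalised spectral entropy of a recording, one commonly used soundscape feature:

import numpy as np
from scipy.signal import spectrogram

def spectral_entropy(audio: np.ndarray, sample_rate: int) -> float:
    """Shannon entropy of the mean power spectrum, normalised to [0, 1]."""
    _freqs, _times, power = spectrogram(audio, fs=sample_rate, nperseg=1024)
    mean_spectrum = power.mean(axis=1)               # average power per frequency bin
    p = mean_spectrum / mean_spectrum.sum()          # convert to a probability mass
    entropy = -np.sum(p * np.log2(p + 1e-12))        # Shannon entropy in bits
    return float(entropy / np.log2(len(p)))          # normalise by the maximum entropy

# Broadband noise should score near 1; a pure tone scores near 0.
rng = np.random.default_rng(0)
one_minute = rng.standard_normal(60 * 44_100)        # synthetic stand-in for a reef clip
print(spectral_entropy(one_minute, 44_100))

An index like this reduces a whole recording to a single number, which is why, as the abstract goes on to describe, combinations of indices are needed.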

However, investigations using different indices have met with mixed success.

Using PAM recordings collected at one of the world’s largest coral reef restoration programmes, we instead applied a machine-learning approach to a suite of ecoacoustic indices to improve predictive power for ecosystem health. Healthy and degraded reef sites were identified through live coral cover surveys, with 90–95% and 0–20% cover, respectively.

A library of one-minute recordings was extracted from each. Twelve ecoacoustic indices were calculated for each recording, across three frequency bandwidths (low: 0.05–0.8 kHz, medium: 2–7 kHz and broad: 0.05–20 kHz). Twelve of these 36 index-bandwidth combinations differed significantly between healthy and degraded habitats.
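
A hypothetical sketch of how such an index-by-bandwidth feature set could be assembled (the two toy indices below are stand-ins for the paper’s twelve; the band edges follow the bandwidths quoted above):

import numpy as np
from scipy.signal import butter, sosfiltfilt

BANDS_HZ = {"low": (50, 800), "medium": (2_000, 7_000), "broad": (50, 20_000)}

INDICES = {
    "rms_level": lambda x: float(np.sqrt(np.mean(x ** 2))),
    "peak_to_rms": lambda x: float(np.max(np.abs(x)) / (np.sqrt(np.mean(x ** 2)) + 1e-12)),
}

def bandpass(audio, sample_rate, low_hz, high_hz):
    """Fourth-order Butterworth band-pass, applied forwards and backwards."""
    nyquist = sample_rate / 2
    high_hz = min(high_hz, 0.99 * nyquist)           # keep the upper edge below Nyquist
    sos = butter(4, [low_hz / nyquist, high_hz / nyquist], btype="band", output="sos")
    return sosfiltfilt(sos, audio)

def feature_vector(audio, sample_rate=44_100):
    """One row of the feature matrix: every index in every bandwidth."""
    features = {}
    for band_name, (lo, hi) in BANDS_HZ.items():
        band = bandpass(audio, sample_rate, lo, hi)
        for index_name, index_fn in INDICES.items():
            features[f"{index_name}_{band_name}"] = index_fn(band)
    return features

rng = np.random.default_rng(0)
clip = rng.standard_normal(60 * 44_100)              # one synthetic one-minute clip
print(feature_vector(clip))                          # 2 indices x 3 bands = 6 features

With two toy indices this yields six features; the study’s twelve indices over the same three bands give the 36 combinations referred to above.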

However, the best-performing single index could correctly classify only 47% of the recordings, requiring extensive sampling from each site to be useful.

We therefore trained a regularized discriminant analysis machine-learning algorithm to discriminate between healthy and degraded sites using an optimised combination of ecoacoustic indices.
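
As an illustration of this step, the sketch below uses scikit-learn’s QuadraticDiscriminantAnalysis with a shrinkage term (reg_param) as a stand-in for regularized discriminant analysis; the feature matrix and labels are synthetic placeholders, not the study’s data:

import numpy as np
from sklearn.discriminant_analysis import QuadraticDiscriminantAnalysis
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
X = rng.standard_normal((200, 36))                   # 200 clips x (12 indices x 3 bands)
y = rng.integers(0, 2, 200)                          # 0 = degraded, 1 = healthy (synthetic)

# Standardise the indices, then fit the regularised discriminant model.
model = make_pipeline(StandardScaler(),
                      QuadraticDiscriminantAnalysis(reg_param=0.5))
model.fit(X, y)
print(model.predict(X[:5]))                          # predicted classes for five clips

In practice the index subset and the regularisation strength would be chosen by cross-validation rather than fixed as above.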

This multi-index approach discriminated between the two habitat classes with greater accuracy than any single index in isolation. Pooled across 1000 cross-validated iterations of the model, the success rate for correctly classifying individual recordings was 91.7% (± 0.8, mean ± SE).
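
The pooled evaluation could be reproduced in outline as follows, again on synthetic placeholders: 1000 shuffled train/test splits, reporting the mean per-recording accuracy and its standard error:

import numpy as np
from sklearn.discriminant_analysis import QuadraticDiscriminantAnalysis
from sklearn.model_selection import ShuffleSplit, cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
X = rng.standard_normal((200, 36))                   # synthetic index-bandwidth features
y = rng.integers(0, 2, 200)                          # synthetic healthy/degraded labels

model = make_pipeline(StandardScaler(),
                      QuadraticDiscriminantAnalysis(reg_param=0.5))
splits = ShuffleSplit(n_splits=1000, test_size=0.2, random_state=0)
accuracy = cross_val_score(model, X, y, cv=splits)   # one score per iteration

standard_error = accuracy.std(ddof=1) / np.sqrt(len(accuracy))
print(f"pooled accuracy: {accuracy.mean():.1%} ± {standard_error:.1%} (mean ± SE)")

A model fitted in this way could then be applied, via model.predict, to recordings from new sites, as in the restoration results below.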

The model was subsequently used to classify recordings from two actively restored sites, established 24 months before recording, with coral cover values of 79.1% (± 3.9) and 66.5% (± 3.8). Of these recordings, 37/38 and 33/39 were classified as healthy, respectively.

The model was also used to classify recordings from a newly restored site, established <12 months before recording, with a coral cover of 25.6% (± 2.6), of which 27/33 recordings were classified as degraded.

This investigation highlights the value of combining PAM recordings with machine-learning analysis for ecological monitoring, and demonstrates the potential of PAM to track reef recovery over time, reducing dependency on labour-intensive in-water surveys by experts.

As access to PAM recorders continues to advance rapidly, effective automated analysis will be required to keep pace with these expanding acoustic datasets.
