Google has launched a new tool that uses artificial intelligence to help marine biologists understand coral reef environments and evaluate their health, contributing to the preservation of these important ecosystems.
The company trained the SurfPerch tool on thousands of audio recordings of marine life in coral reefs.
The tool lets coral reef researchers listen in on the health of these reefs from the inside, monitor their activity at night, and track reefs in deeper waters.
The project began as a public engagement effort inviting people to listen to the sounds of coral reefs online.
Over the past year, visitors to the Calling in Our Corals website were able to listen to more than 400 hours of coral reef audio recorded at locations around the world, and were asked to mark where in the recordings they heard fish.
This led to the creation of an important audio dataset focusing on the health of coral reefs.
Drawing on this crowdsourced effort, Google built a new audio library of fish sounds and used it to develop the artificial intelligence tool SurfPerch.
SurfPerch can now be trained quickly to detect any new sound coming from a coral reef.
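To make this concrete, here is a minimal, hypothetical sketch of that kind of few-shot workflow: a lightweight classifier trained on embeddings that a pretrained audio model is assumed to have already produced. The array shapes, labels, and the use of scikit-learn are illustrative assumptions, not Google's actual SurfPerch code.

```python
# Hypothetical sketch: training a lightweight detector for a new reef sound
# from a handful of labeled clips. It assumes embeddings were already computed
# by a pretrained audio model; the arrays below are placeholders.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Pretend we embedded 20 five-second clips into 1280-dimensional vectors and
# hand-labeled whether the new sound (say, a particular fish call) is present.
embeddings = rng.normal(size=(20, 1280))
labels = np.array([1] * 10 + [0] * 10)        # 1 = sound present, 0 = absent

# A simple linear classifier on top of frozen embeddings trains in seconds.
detector = LogisticRegression(max_iter=1000).fit(embeddings, labels)

# Score a new, unlabeled clip's embedding.
new_clip = rng.normal(size=(1, 1280))
print("probability of target sound:", detector.predict_proba(new_clip)[0, 1])
```

Because only the small linear layer is trained, adding a detector for a newly noticed reef sound takes seconds rather than a full training run, which is what makes this kind of tool quick to adapt.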
Thanks to this new technology developed by Google, researchers can now analyze data far more efficiently and without expensive equipment, opening new avenues for studying, protecting, and preserving coral reef communities.
Professor Steve Simpson, a marine biologist at the University of Bristol in the UK, and Dr. Ben Williams, a marine biologist at King’s College London, have joined the effort. Their research focuses on coral reef ecosystems, the impact of climate change on them, and ways to restore them.
In addition, the researchers found they could improve SurfPerch's performance by training it on bird recordings. Despite the clear differences between bird and fish sounds, the scientists noted common patterns in the two, and the model learned effectively from both.
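The transfer-learning idea behind that finding can be sketched as follows: an audio backbone assumed to have been pretrained on birdsong is frozen, and only a small classification head is trained on labeled reef recordings. The backbone here is an untrained stand-in so the example runs end to end; the layer sizes, class count, and spectrogram shape are invented for illustration and are not SurfPerch's actual architecture.

```python
# Hypothetical transfer-learning sketch: reuse a bird-audio backbone, train
# only a new head on reef labels. All shapes and names are assumptions.
import numpy as np
import tensorflow as tf

EMBED_DIM = 1280          # assumed embedding size of the pretrained backbone
NUM_REEF_CLASSES = 4      # e.g. fish calls, snapping shrimp, boat noise, other

def load_pretrained_bird_backbone() -> tf.keras.Model:
    """Stand-in for a backbone pretrained on bird vocalizations."""
    # A real workflow would load published pretrained weights here; this
    # untrained placeholder just keeps the sketch self-contained.
    return tf.keras.Sequential([
        tf.keras.Input(shape=(128, 256, 1)),   # log-mel spectrogram
        tf.keras.layers.Conv2D(32, 3, activation="relu"),
        tf.keras.layers.GlobalAveragePooling2D(),
        tf.keras.layers.Dense(EMBED_DIM),
    ])

backbone = load_pretrained_bird_backbone()
backbone.trainable = False   # keep what was learned from bird sounds

classifier = tf.keras.Sequential([
    backbone,
    tf.keras.layers.Dense(NUM_REEF_CLASSES, activation="softmax"),
])
classifier.compile(optimizer="adam", loss="sparse_categorical_crossentropy")

# Tiny fake batch of labeled reef spectrograms standing in for real data.
x = np.random.rand(8, 128, 256, 1).astype("float32")
y = np.random.randint(0, NUM_REEF_CLASSES, size=8)
classifier.fit(x, y, epochs=1, verbose=0)
```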
In initial experiments combining Calling in Our Corals data with SurfPerch, researchers were able to identify differences between protected and unprotected coral reefs in the Philippines, track recovery outcomes in Indonesia, and better understand relationships with fish communities on the Great Barrier Reef.
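As a purely hypothetical illustration of the first of those analyses, the snippet below compares model detection rates between protected and unprotected sites; the column names and numbers are invented for the example and are not results from the study.

```python
# Hypothetical downstream comparison: count detections per recording at
# protected vs. unprotected sites and test whether the rates differ.
import pandas as pd
from scipy.stats import mannwhitneyu

detections = pd.DataFrame({
    "site_type": ["protected"] * 4 + ["unprotected"] * 4,
    "fish_calls_per_hour": [42, 55, 48, 60, 21, 18, 30, 25],
})

protected = detections.loc[detections.site_type == "protected", "fish_calls_per_hour"]
unprotected = detections.loc[detections.site_type == "unprotected", "fish_calls_per_hour"]

stat, p_value = mannwhitneyu(protected, unprotected)
print(detections.groupby("site_type")["fish_calls_per_hour"].mean())
print(f"Mann-Whitney U p-value: {p_value:.3f}")
```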
The project continues to enhance the Calling in Our Corals site by adding new sounds, contributing to public education and to the further training of the artificial intelligence, according to Google.