
OPINION article

Front. Mar. Sci. , 31 January 2025

Sec. Marine Ecosystem Ecology

Volume 12 - 2025 | https://doi.org/10.3389/fmars.2025.1508851

This article is part of the Research Topic Bridging Knowledge Gaps in Marine Biological Invasions.

Bridging the gap between the public’s knowledge and detection of marine non-indigenous species through developing automated image classification applications for marine species

  • 1Key Laboratory of Marine Ecosystem Dynamics, Ministry of Natural Resources and Second Institute of Oceanography, Ministry of Natural Resources, Hangzhou, China
  • 2Key Laboratory of System Control and Information Processing, Institute of Image Processing and Pattern Recognition, Shanghai Jiao Tong University, Ministry of Education of China, Shanghai, China

Introduction

Biological invasions are impacting biodiversity, ecosystems, and socio-economies globally. Marine non-indigenous species (mNIS) can be introduced through human activities, such as maritime shipping and the careless discarding of aquarium species. Despite significant efforts to prevent the introduction of mNIS, new occurrences continue to be recorded, including fishes, crustaceans, ascidians, anthozoans, bryozoans, sponges, macroalgae, seagrasses and mangroves (Alidoost Salimi et al., 2021). Once mNIS become established in recipient regions, controlling and eradicating them is a challenging task. Early awareness of mNIS enhances the effectiveness of early response, particularly during the introduction phase, and is critical for reducing future impacts. Therefore, it is imperative to develop reliable and cost-effective strategies for the early detection of mNIS before they establish themselves in new habitats and threaten local biodiversity.

The public plays an important role in marine conservation (Earp and Liconti, 2020), for example in detecting and monitoring outbreaks of the starfish Acanthaster spp. (Dumas et al., 2020) and in managing the invasive lionfish Pterois volitans (Clements et al., 2021). To monitor the presence of mNIS, actions such as watch lists and guidebooks have been used to help the public become familiar with and effectively recognize these species. However, because of the high diversity of marine species, identifying specimens accurately requires extensive domain knowledge and skills, which can be challenging even for experts. It is particularly difficult for the public to identify newly introduced species that they are unfamiliar with, especially during the early stages of a biological invasion. Traditionally, recognizing unfamiliar marine organisms, which may be non-native, requires access to scientific expertise, such as training, consulting illustrated guides, or searching numerous webpages, all of which can be time-consuming. With the development of artificial intelligence (AI), the public can turn to online resources for image search and analysis; however, achieving accurate image classification of marine species still requires professional automated tools to ensure reliable identification. Therefore, with the aim of supporting people worldwide in achieving early awareness of mNIS, this opinion highlights achievements in developing AI-based applications for automated image classification of marine species, addresses existing gaps in mNIS detection, and suggests potential future solutions.

Advancements in automated image classification applications for marine species

Image-based automated classification has been a long-standing interdisciplinary field of research, bridging marine biology and computer vision. In recent years, advancements in artificial intelligence, particularly deep learning, have significantly propelled the development of applications for the automated classification of marine species from images (Table 1). For instance, SuperFish, a smartphone application, has been tailored to identify 38 fish species commonly found in Mauritian waters, achieving an accuracy of 98% (Pudaruth et al., 2021). Similarly, WikiFish, another mobile application, focuses on identifying 89 fish species prevalent in the Mediterranean Sea, with an accuracy of 80% (Elbatsh et al., 2022). To address the pressing issue of fish fraud resulting from species mislabeling in the seafood industry and aquarium trade, a mobile application named Fishify has been developed; trained on a dataset of over 50,000 images across 44 classes (including a separate class for non-fish images), it achieves an accuracy of 95% (Dhore et al., 2024).

Table 1. Examples of applications for automated image classification of marine species.
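Many of the applications above follow a similar transfer-learning recipe: a backbone pre-trained on generic images is reused as a feature extractor, and only a small classification head is trained on labeled photos of the target species. The sketch below illustrates this idea with TensorFlow/Keras and a MobileNet backbone (the architecture Fishify reportedly builds on); the folder layout, image size, class count, and training settings are illustrative assumptions rather than details of any published application.

```python
# Minimal transfer-learning sketch in the spirit of apps such as Fishify.
# Folder layout, image size, class count and epochs are illustrative assumptions.
import tensorflow as tf

IMG_SIZE = (224, 224)
NUM_CLASSES = 44  # e.g. 44 classes, including a separate non-fish class

# Load labeled images from a directory with one sub-folder per species.
train_ds = tf.keras.utils.image_dataset_from_directory(
    "fish_images/train", image_size=IMG_SIZE, batch_size=32)

# Reuse ImageNet features; train only a small classification head.
base = tf.keras.applications.MobileNet(
    input_shape=IMG_SIZE + (3,), include_top=False, weights="imagenet")
base.trainable = False

model = tf.keras.Sequential([
    tf.keras.layers.Rescaling(1.0 / 127.5, offset=-1),  # MobileNet expects [-1, 1]
    base,
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dropout(0.2),
    tf.keras.layers.Dense(NUM_CLASSES, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(train_ds, epochs=5)
```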

Fishial.AI, a project leveraging AI for fish species identification, has launched a portal at https://portal.fishial.ai, where users can submit images for automated identification and receive prompt prediction results. The built-in detector not only recognizes fish species but also generates segmentation polygons around each fish in the photo. Currently, the classification model can identify over 290 fish species by their scientific names (https://www.fishial.ai/solutions#image-cms), with accuracy rates ranging from 70% to 90% (https://docs.fishial.ai/home/features#fishial-recognition%E2%84%A2). Notably, Fishial.AI runs its model on a cloud server accessible via an API, while keeping its code open source to help other developers create AI-based applications.
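For developers, a cloud-hosted model of this kind is typically consumed by uploading an image and parsing the returned candidate identifications. The sketch below shows what such a client might look like; the endpoint URL, request parameters, and response fields are purely illustrative assumptions and do not describe Fishial.AI's actual API, which is documented at the links above.

```python
# Hypothetical client sketch for a cloud-hosted identification API.
# The URL, parameters and response fields below are illustrative only and
# do NOT reflect Fishial.AI's actual endpoints or schema.
import requests

API_URL = "https://example-identification-service.org/v1/identify"  # placeholder

def identify(image_path: str, api_key: str) -> list[dict]:
    """Upload an image and return candidate species with confidence scores."""
    with open(image_path, "rb") as f:
        resp = requests.post(
            API_URL,
            headers={"Authorization": f"Bearer {api_key}"},
            files={"image": f},
            timeout=30,
        )
    resp.raise_for_status()
    # Assumed response shape: [{"scientific_name": ..., "confidence": ...}, ...]
    return resp.json()

for candidate in identify("photo.jpg", api_key="YOUR_KEY"):
    print(candidate["scientific_name"], candidate["confidence"])
```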

Moreover, to meet the growing demand for efficient recognition of marine fish species in the global oceans, FishAI, an automated web application, has been developed for hierarchical image classification at five taxonomic levels (Yang et al., 2024). Utilizing images of 808 marine fish species from the World Register of Marine Species (WoRMS), the application achieves an accuracy of 62.6% at the species level on test images. FishAI allows users to save and share search results by providing an email address. Beyond fish, deep learning approaches have been applied to developing image classification models for marine invertebrates. Examples include iBivalves for bivalves (Maravillas et al., 2023), CoralNet for corals (Chen et al., 2021), and EchoAI for echinoderms (Zhou et al., 2023). Recently, an AI-empowered citizen science approach has been introduced in biological surveys focused on marine ecological conservation at Cape Santiago on Taiwan Island. To assist citizen science participants during these surveys, a web-based application was developed to host a trained AI model designed to recognize more than 20 commonly observed species (Chen et al., 2024).
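One simple way to present hierarchical results of the kind FishAI reports, without claiming anything about its actual model, is to map a species-level prediction onto a taxonomy table (for example, one derived from WoRMS) so that users also see the genus, family, order, and class. The lookup table, species, and confidence value below are illustrative assumptions.

```python
# Illustrative sketch of hierarchical output at five taxonomic ranks.
# This is NOT FishAI's published method; it simply maps a species-level
# prediction onto a hypothetical WoRMS-derived taxonomy lookup table.
TAXONOMY = {
    "Pterois volitans": {
        "genus": "Pterois", "family": "Scorpaenidae",
        "order": "Scorpaeniformes", "class": "Actinopterygii",
    },
    # ... one entry per species in the training set
}

def hierarchical_prediction(species: str, confidence: float) -> dict:
    """Return the predicted label at each rank, inheriting the species confidence."""
    ranks = TAXONOMY.get(species, {})
    return {"species": species, "confidence": confidence, **ranks}

print(hierarchical_prediction("Pterois volitans", 0.87))
```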

These AI-based applications have demonstrated considerable power in assisting the public, including novices, divers, fishermen, and scientists, in identifying marine species. They also hold significant potential for detecting mNIS. For instance, divers may encounter unfamiliar organisms of great interest to them, and these unfamiliar species could potentially be mNIS. With the help of these applications, the public can save the time otherwise spent navigating numerous websites, atlases, and books and comparing images to identify unknown species. Easy-to-use applications for species identification are therefore beneficial for the early detection and awareness of mNIS.

Addressing gaps and providing recommendations

Despite the development of applications mentioned above, significant gaps remain, particularly in the availability of ready-to-use automated image classification applications for marine species. These applications are crucial for assisting the public in early awareness and monitoring of mNIS.

First, the primary motivation behind the current applications is not the detection of mNIS. For instance, Fishify and WikiFish were tailored for preventing fish fraud and enhancing food safety. Users may not necessarily consider whether what they observe is a mNIS. To engage the public, developers could incorporate follow-up questions after the identification results, such as “Is it an alien species?”, which may stimulate users’ curiosity. Furthermore, to help users compare the locations where they have sighted specimens with known distributions, developers could add links to biogeographic information resources. For example, the Global Biodiversity Information Facility (https://www.gbif.org/) and the Ocean Biodiversity Information System (https://obis.org/) provide distribution maps displaying the locations of occurrence records. Such features may foster public engagement in detecting mNIS. By integrating biogeographical data, AI-empowered applications would conveniently assist the public in preventing the introduction and spread of mNIS.
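As a concrete illustration of such a link, the public GBIF occurrence API can be queried for the recorded locations of an identified species, which a user could then compare with the site of their own sighting. The species name and the simple printout below are illustrative; OBIS exposes a comparable occurrence API.

```python
# Sketch of linking an identification result to biogeographic records via the
# public GBIF occurrence API (https://api.gbif.org). The example species and
# the simple printout are illustrative only.
import requests

def gbif_occurrences(scientific_name: str, limit: int = 20) -> list[dict]:
    """Fetch recorded occurrences (with coordinates) for a species from GBIF."""
    resp = requests.get(
        "https://api.gbif.org/v1/occurrence/search",
        params={"scientificName": scientific_name,
                "hasCoordinate": "true", "limit": limit},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json().get("results", [])

# Show where Carcinus maenas has previously been recorded, so a user can
# compare those locations with the site of their own sighting.
for occ in gbif_occurrences("Carcinus maenas"):
    print(occ.get("country"), occ.get("decimalLatitude"), occ.get("decimalLongitude"))
```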

Second, gaps in datasets and algorithms should be addressed. When dealing with a relatively small number of categories, applications such as SuperFish for 38 fish species (Pudaruth et al., 2021), WikiFish for 89 fish species (Elbatsh et al., 2022), and iBivalves for 12 bivalve species (Maravillas et al., 2023) have achieved high classification accuracies, ranging from 80% to 98%. However, the number of categories these applications can identify falls far short of the estimated number of species in the oceans. To enhance an AI model’s ability to accurately identify broader categories of marine species, a primary task is to establish a substantial, global dataset of labeled images for training AI architectures. For fish, a comprehensive vision-language benchmark dataset named Wildfish++ was released in 2021, encompassing 2,348 fish species with 103,034 images (Zhuang et al., 2021). More recently, another large-scale fish image dataset, FishNet, was created, containing 94,532 images of 17,357 aquatic species organized by biological taxonomy (order, family, genus, and species) (Khan et al., 2023). As the number of species included in such datasets increases, accuracy, particularly in fine-grained classification, still needs improvement; for instance, FishAI and models trained on Wildfish++ and FishNet only achieve accuracies of 60–70% at the species level. Misidentification can lead to inaccurate and misleading assessments and, potentially, improper management decisions. To address the data gap, image data from global-scale databases with expert-curated labels, such as WoRMS, could be used to train state-of-the-art AI architectures, as WoRMS was established to provide an authoritative and comprehensive list of names of marine organisms and its content is controlled by taxonomic experts. To enhance classification accuracy, it is recommended to increase the number of images per category for model training and to employ state-of-the-art architectures, including those designed for few-shot learning, to construct the classification models.
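For scaling to many more categories, a common starting point is to fine-tune a modern pre-trained architecture, such as a vision transformer, on a large expert-labeled image collection. The sketch below uses the timm and torchvision libraries; the dataset path, class count, and hyperparameters are assumptions for illustration and do not reproduce the training recipes of FishAI, Wildfish++, or FishNet.

```python
# Minimal sketch of fine-tuning a modern vision architecture on a curated,
# labeled image collection. Paths and hyperparameters are assumptions.
import timm
import torch
from torch.utils.data import DataLoader
from torchvision import datasets, transforms

NUM_SPECIES = 808  # e.g. the number of species covered by FishAI

transform = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.5, 0.5, 0.5], std=[0.5, 0.5, 0.5]),
])
train_set = datasets.ImageFolder("marine_images/train", transform=transform)
loader = DataLoader(train_set, batch_size=32, shuffle=True)

# Pre-trained Vision Transformer with a new classification head.
model = timm.create_model("vit_base_patch16_224", pretrained=True,
                          num_classes=NUM_SPECIES)
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)
criterion = torch.nn.CrossEntropyLoss()

# Single pass over the data; a real run would add a GPU, validation and more epochs.
model.train()
for images, labels in loader:
    optimizer.zero_grad()
    loss = criterion(model(images), labels)
    loss.backward()
    optimizer.step()
```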

Third, there is an imbalance in the development of automated image classification applications. While numerous applications have been constructed for the automated classification of fish images, significantly fewer have been developed for other categories of marine species. Many mNIS of considerable concern are invertebrates, algae, or plants (Alidoost Salimi et al., 2021), such as the invasive green crab Carcinus maenas (Jamieson et al., 1998), the biofouling bivalve Mytilopsis leucophaeata in Europe (Verween et al., 2006), the documented biofouling mNIS in the USA (Lord et al., 2015), the invasive biofouling cnidarian Pennaria disticha in the Mediterranean Sea (Bosch-Belmar et al., 2022), Tubastraea spp. (sun coral) in the Southwest Atlantic (Coelho et al., 2022), and invasive echinoderm species (Ling et al., 2009; Lang et al., 2023; Ling and Keane, 2024). Encouragingly, ready-to-use applications for marine species other than fish are increasing, such as CoralNet for corals (Chen et al., 2021) and EchoAI for echinoderms (Zhou et al., 2023); however, automated image classification applications are still lacking for the underrepresented categories of mNIS, and more effort should be directed toward addressing this imbalance.

Last but not least, usability remains a crucial factor. Despite numerous academic publications reporting the advanced performance of AI models, such as those trained on datasets like Wildfish++ and FishNet, and the subsequent release of their source code for public access, individuals without deep learning or programming skills still face challenges in using these models. Users are often required to configure their environments, retrain algorithms, or optimize hyperparameters, which can be daunting tasks. Similarly, although open-source and freely available computer vision platforms, such as Video and Image Analytics for Marine Environments (VIAME, viame.kitware.com), enable users with no programming experience to create do-it-yourself AI for image analytics, users still need both the capability and the time to familiarize themselves with the platform and its workflows. Publicly accessible and easy-to-use applications, such as server-based websites and mobile applications, are therefore crucial for bridging the gap between academic research and practical application by making research findings more accessible and actionable. For mobile applications specifically, cross-platform compatibility is essential to ensure wide accessibility and usability across operating systems.

Discussion

AI-based technologies are transforming the detection of mNIS. In recent years, AI-based image classification techniques have been used to address challenges posed by mNIS, such as identifying the five most common invasive fish species in Maltese coastal waters (Mifsud Scicluna et al., 2024) and protecting native Atlantic salmon from being overwhelmed by invasive Pacific salmon in Norway (https://www.huawei.com/en/tech4all/stories/saving-the-atlantic-salmon-in-norway). In the latter case, real-time monitoring systems deployed in rivers, equipped with underwater cameras and AI-triggered automated gates, allow wild Atlantic salmon and other fish to pass upstream to spawn while diverting the invasive salmon into a holding tank for removal.

Furthermore, these AI-based applications can benefit professionals by enabling automated verification of records submitted by the public.

In conclusion, as more images of marine species from the global oceans are collected and used for training, AI-based tools for automated image classification can be continually improved. Despite the persistent gaps between public knowledge and the detection of mNIS, advancing toward ready-to-use automated applications would help bridge these gaps by assisting the public in recognizing marine species and would boost public participation in the detection and monitoring of mNIS. These robust tools could serve as cost-efficient solutions for biodiversity monitoring and conservation, for example by leveraging them to develop devices for the real-time detection and capture of mNIS.

Author contributions

PZ: Conceptualization, Data curation, Funding acquisition, Investigation, Methodology, Project administration, Resources, Supervision, Writing – original draft, Writing – review & editing. X-QH: Writing – original draft. Z-YT: Writing – review & editing. DS: Writing – review & editing. C-SW: Writing – review & editing. H-BS: Writing – review & editing. XP: Writing – original draft, Writing – review & editing.

Funding

The author(s) declare that financial support was received for the research, authorship, and/or publication of this article. This work was supported by the National Key R&D Program of China (Grant no. 2022YFC2806805), the Oceanic Interdisciplinary Program of Shanghai Jiao Tong University (No. SL2022ZD108), and the Chun-Tsung Program of Shanghai Jiao Tong University (No. 2023-01-05).

Conflict of interest

The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

Generative AI statement

The author(s) declare that no Generative AI was used in the creation of this manuscript.

Publisher’s note

All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article, or claim that may be made by its manufacturer, is not guaranteed or endorsed by the publisher.

References

Alidoost Salimi P., Creed J. C., Esch M. M., Fenner D., Jaafar Z., Levesque J. C., et al. (2021). A review of the diversity and impact of invasive non-native species in tropical marine ecosystems. Mar. Biodivers Records 14, 11. doi: 10.1186/s41200-021-00206-8


Bosch-Belmar M., Piraino S., Sarà G. (2022). Predictive metabolic suitability maps for the thermophilic invasive hydroid Pennaria disticha under future warming Mediterranean Sea scenarios. Front. Mar. Sci. 9. doi: 10.3389/fmars.2022.810555


Chen Q., Beijbom O., Chan S., Bouwmeester J., Kriegman D. (2021). A new deep learning engine for CoralNet. Available online at: https://openaccess.thecvf.com/content/ICCV2021W/OceanVision/html/Chen_A_New_Deep_Learning_Engine_for_CoralNet_ICCVW_2021_paper.html (Accessed October 1, 2024).


Chen V. Y., Lu D.-J., Han Y.-S. (2024). Hybrid intelligence for marine biodiversity: integrating citizen science with AI for enhanced intertidal conservation efforts at Cape Santiago, Taiwan. Sustainability 16, 454. doi: 10.3390/su16010454


Clements K. R., Karp P., Harris H. E., Ali F., Candelmo A., Rodríguez S. J., et al. (2021). The role of citizen science in the research and management of invasive lionfish across the western Atlantic. Diversity 13, 673. doi: 10.3390/d13120673


Coelho S. C. C., Gherardi D. F. M., Gouveia M. B., Kitahara M. V. (2022). Western boundary currents drive sun-coral (Tubastraea spp.) coastal invasion from oil platforms. Sci. Rep. 12, 5286. doi: 10.1038/s41598-022-09269-8


Dhore M., Walunj A., Bhandari A., Dighe A., Sagri A. (2024). “Fishify: A mobile-based fish species identification app with transfer learning using MobileNetV1,” in Smart Trends in Computing and Communications. Eds. Senjyu T., So–In C., Joshi A. (Springer Nature, Singapore), 397–408. doi: 10.1007/978-981-97-1323-3_34


Dumas P., Fiat S., Durbano A., Peignon C., Mou-Tham G., Ham J., et al. (2020). Citizen Science, a promising tool for detecting and monitoring outbreaks of the crown-of-thorns starfish Acanthaster spp. Sci. Rep. 10, 291. doi: 10.1038/s41598-019-57251-8


Earp H. S., Liconti A. (2020). “Science for the Future: The Use of Citizen Science in Marine Research and Conservation,” in YOUMARES 9 - The Oceans: Our Research, Our Future. Eds. Jungblut S., Liebich V., Bode-Dalby M. (Springer International Publishing, Cham), 1–19. doi: 10.1007/978-3-030-20389-4_1


Elbatsh K., Sokar I., Ragab S. (2022). “WikiFish: Mobile app for fish species recognition using deep convolutional neural networks,” in Proceedings of the 2021 4th International Conference on Computational Intelligence and Intelligent Systems (Association for Computing Machinery, New York, NY, USA), 13–18. doi: 10.1145/3507623.3507626


Jamieson G. S., Grosholz E. D., Armstrong D. A., Elner R. W. (1998). Potential ecological implications from the introduction of the European green crab, Carcinus maenas (Linnaeus), to British Columbia, Canada, and Washington, USA. J. Natural History 32, 1587–1598. doi: 10.1080/00222939800771121


Khan F. F., Li X., Temple A. J., Elhoseiny M. (2023). “FishNet: A large-scale dataset and benchmark for fish recognition, detection, and functional trait prediction,” in 2023 IEEE/CVF International Conference on Computer Vision (ICCV). Paris, France. 20439–20449 (IEEE). doi: 10.1109/ICCV51070.2023.01874


Lang B. J., Donelson J. M., Bairos-Novak K. R., Wheeler C. R., Caballes C. F., Uthicke S., et al. (2023). Impacts of ocean warming on echinoderms: A meta-analysis. Ecol. Evol. 13, e10307. doi: 10.1002/ece3.10307


Ling S. D., Johnson C. R., Ridgway K., Hobday A. J., Haddon M. (2009). Climate-driven range extension of a sea urchin: Inferring future trends by analysis of recent population dynamics. Global Change Biol. 15, 719–731. doi: 10.1111/j.1365-2486.2008.01734.x


Ling S. D., Keane J. P. (2024). Climate-driven invasion and incipient warnings of kelp ecosystem collapse. Nat. Commun. 15, 400. doi: 10.1038/s41467-023-44543-x


Lord J. P., Calini J. M., Whitlatch R. B. (2015). Influence of seawater temperature and shipping on the spread and establishment of marine fouling species. Mar. Biol. 162, 2481–2492. doi: 10.1007/s00227-015-2737-2


Maravillas A. B., Feliscuzo L. S., Nogra J. A. E. (2023). Neural network approach for bivalves classification. J. Eng. Sci. Technol. 1–16.


Mifsud Scicluna B., Gauci A., Deidun A. (2024). AquaVision: AI-powered marine species identification. Information 15, 437. doi: 10.3390/info15080437


Pudaruth S., Nazurally N., Appadoo C., Kishnah S., Chady F. (2021). SuperFish: A mobile application for fish species recognition using image processing techniques and deep learning. IJCDS 10, 1157–1165. doi: 10.12785/ijcds/1001104


Verween A., Vincx M., Degraer S. (2006). Growth patterns of Mytilopsis leucophaeata, an invasive biofouling bivalve in Europe. Biofouling 22, 221–231. doi: 10.1080/08927010600816401


Yang C., Zhou P., Wang C., Fu G., Xu X., Niu Z., et al. (2024). FishAI: Automated hierarchical marine fish image classification with vision transformer. Eng. Rep., e12992. doi: 10.1002/eng2.12992


Zhou Z., Fu G.-Y., Fang Y., Yuan Y., Shen H.-B., Wang C.-S., et al. (2023). EchoAI: A deep-learning based model for classification of echinoderms in global oceans. Front. Mar. Sci. 10. doi: 10.3389/fmars.2023.1147690


Zhuang P., Wang Y., Qiao Y. (2021). Wildfish++: A comprehensive fish benchmark for multimedia research. IEEE Trans. Multimedia 23, 3603–3617. doi: 10.1109/TMM.2020.3028482


Keywords: non-indigenous species, biological invasions, biodiversity, public support, artificial intelligence, automated image classification

Citation: Zhou P, He X-Q, Tu Z-Y, Sun D, Wang C-S, Shen H-B and Pan X (2025) Bridging the gap between the public’s knowledge and detection of marine non-indigenous species through developing automated image classification applications for marine species. Front. Mar. Sci. 12:1508851. doi: 10.3389/fmars.2025.1508851

Received: 10 October 2024; Accepted: 21 January 2025;
Published: 31 January 2025.

Edited by:

Koebraa Peters, Cape Peninsula University of Technology, South Africa

Reviewed by:

Edson A. Vieira, Federal University of Rio Grande do Norte, Brazil

Copyright © 2025 Zhou, He, Tu, Sun, Wang, Shen and Pan. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

*Correspondence: Peng Zhou, zhoupeng@sio.org.cn; Xiaoyong Pan, 2008xypan@sjtu.edu.cn

