- Laboratoire de Technologie et de Biologie Halieutique, Institut Agro, Dynamique et Durabilité des Écosystèmes (DECOD, Ecosystem Dynamics and Sustainability), Institut Français de Recherche pour l'Exploitation de la Mer (IFREMER), Institut National de Recherche pour l'Agriculture, l'Alimentation et l'Environnement (INRAE), Lorient, France
Through the advancement of observation systems, our vision has extended far into the world of fishes and how they interact with fishing gears, breaking through physical boundaries and adapting to visually challenging marine environments. As marine sciences step into the era of artificial intelligence (AI), deep learning models now provide researchers with tools to process large amounts of imagery data (i.e., image sequences, videos) on fish behavior in a more time-efficient and cost-effective manner. The latest AI models for detecting fish and categorizing species now approach human-like accuracy. Nevertheless, robust tools to track fish movements in situ are still under development and have primarily focused on tropical species. Data for accurately interpreting fish interactions with fishing gears are still lacking, especially for temperate fishes, yet such data are an essential step for selectivity studies to advance and to integrate AI methods into assessing the effectiveness of modified gears. Here, we conduct a bibliometric analysis to review the recent advances and applications of AI in automated tools for fish tracking, classification, and behavior recognition, highlighting how they may ultimately help improve gear selectivity. We further show how transforming external stimuli that influence fish behavior, such as sensory cues and gears as background, into interpretable features that models learn to distinguish remains challenging. By presenting the recent advances in AI on fish behavior applied to fishing gear improvements (e.g., Long Short-Term Memory (LSTM), Generative Adversarial Network (GAN), coupled networks), we discuss the potential and limits of AI in meeting the demands of fishing policies and sustainability goals, as scientists and developers continue to collaborate on building the databases needed to train deep learning models.
1 Introduction
In observing fishes, the human eye can efficiently distinguish swimming movements: where the fish is, how it is swimming, and how it is interacting with other fishes and its environment (He, 2010). For ethologists, interpreting behaviors from visual observations comes almost instantaneously. As non-invasive and autonomous underwater video cameras continue to advance (Graham et al., 2004; Moustahfid et al., 2020), behavioral observations can now be derived from a plethora of high-resolution marine imagery and videos (Logares et al., 2021). The reach of human vision continues to extend as cameras can be used in most conditions (Shafait et al., 2016; Christensen et al., 2018; Jalal et al., 2020), including lit, dark, and turbid waters, and can operate at greater depths and over longer periods (Torres et al., 2020; Bilodeau et al., 2022; Xia et al., 2022). Cameras can now provide 2D or 3D vision of how fishes interact with the fishing gears used to capture marine species (e.g., pots, lines, trawls and nets), allowing behavior to be recorded by an observation system. This has given direct views of how gear components affect catches and escapement (Graham, 2003; Nian et al., 2013; Rosen et al., 2013; Williams et al., 2016; Langlois et al., 2020; Sokolova et al., 2021; Lomeli et al., 2021) and has opened windows onto the behaviors of fishes in virtually any environmental condition (Robert et al., 2020; Cuende et al., 2022).
This marked an important step toward capturing finer details in the process of fishing gear selectivity (i.e., the gear's ability to retain only targeted species, while avoiding bycatch of vulnerable, unwanted species or undersized individuals). Innovations in gear selectivity continue to bring new types of selection and bycatch reduction devices into gear designs (for reviews of selective and bycatch reduction devices, see Vogel, 2016; Matt et al., 2021; for grids: Brinkhof et al., 2020; for mesh size: Kim et al., 2008; Aydin and Tosunoğlu, 2010; Cuende et al., 2020b; Cuende et al., 2022; for panels: Bullough et al., 2007; Ferro et al., 2007). By observing the influence of these modifications, finer selectivity patterns have been unraveled, highlighting how the visual, auditory and tactile cues to which species are sensitive are key in the capture process (Arimoto et al., 2010; Yan et al., 2010). As studies of fish vision show behavioral differences across species in relation to their spectral sensitivity (Goldsmith and Fernandez, 1968; Carleton et al., 2020), gears continue to be developed with visual components, such as light and color, that aim to make them more or less detectable (Ellis and Stadler, 2005; Sarriá et al., 2009; Underwood et al., 2021). Mesh and panel configurations affect tactile cues and herding behavior, which can differ among species (Ryer et al., 2006). They are therefore continually tested across different fishing zones (Ferro et al., 2007; Cuende et al., 2020a), as environmental conditions such as depth and light penetration change fish behavior (Blaxter, 1988). How visual, acoustic, or mechanosensory stimuli elicit fish movement has been extensively studied (e.g., Forlim and Pinto, 2014; Popper and Hawkins, 2019; Xu et al., 2022). Quantifying the reactions of fishes to stimuli or gear modifications requires assessing swimming patterns that are highly variable and nonlinear, as fishes are under stress and in constant locomotion (Kim and Wardle, 2003; Kim and Wardle, 2005) and are affected by several environmental factors (Schwarz, 1985; Baatrup, 2009; Yu et al., 2021; Xu et al., 2022). Moreover, movements often differ between individual and group behavior (Viscido et al., 2004; Stienessen and Parrish, 2013; Harpaz et al., 2017; Schaerf et al., 2017).
As of today, automated tools for fish recognition have mostly been driven by economic frameworks, such as monitoring welfare on fish farms (Zhou et al., 2017; Muñoz-Benavent et al., 2018; Cheng et al., 2019; Måløy et al., 2019; Bekkozhayeva et al., 2021; X. Yang et al., 2020), directing migratory trajectories in river passageways (Stuart et al., 2008; Cooke et al., 2020; Eickholt et al., 2020; Jones and Hale, 2020) and conducting stock assessments (Mellody, 2015; Myrum et al., 2019; Connolly et al., 2021; Ovchinnikova et al., 2021). Artificial Intelligence (AI) has thus become a multi-purpose data processing tool in marine science, integrated into model simulations, predictions of physical and ecological events (Chen et al., 2013) and imagery data processing from large-scale to fine-scale observations (Beyan and Browman, 2020). Yet, observations often focus on the temporal aspects of swimming behavior at a 2D scale (Lee et al., 2004; G. Wang et al., 2021), lacking the spatial depth and 3D components of the real world and thus providing only a narrow window onto actual behavior as a whole. These movements and their complexity need to be transformed into meaningful metrics derived from video observations (Aguzzi et al., 2015; Pereira et al., 2020). Doing so manually requires a tremendous amount of time, focus and effort, and is subject to error and incomplete observations (Huang et al., 2015; Guidi et al., 2020). This is where AI methods enter (Packard et al., 2021): the principle is to translate what the human eye sees and what the brain interprets into computer vision (or machine vision) and artificial neural networks (van Gerven and Bohte, 2017; Boyun et al., 2019). For computer vision, images of fishes and their corresponding features (temporal and spatial) must therefore be translated into numerical units that the computer can process (Aguzzi et al., 2015).
Studies and innovations in fish observation over the past decade have successfully generated models that can automatically see fishes in videos, identify taxa and follow their swimming direction with considerable accuracy (Hsiao et al., 2014; Nasreddine and Benzinou, 2015; Ravanbakhsh et al., 2015; Boudhane and Nsiri, 2016; Qin et al., 2016; Marini et al., 2018; Xu and Matzner, 2018; Salman et al., 2019; Cai et al., 2020; Cui et al., 2020; Jalal et al., 2020; Raza and Hong, 2020; Yuan et al., 2020; Ben Tamou et al., 2021; Cao et al., 2021; Crescitelli et al., 2021; Li et al., 2021; Lopez-Marcano et al., 2021; Knausgård et al., 2021). Despite these advancements, it remains challenging to train existing AI models (e.g., Convolutional Neural Network, CNN; Faster Recurrent CNN, Faster RCNN; Residual Network, ResNet; Long Short-Term Memory, LSTM; Convolutional 3-dimensional network, C3D, etc.) to recognize fish behaviors from their swimming movements in 3D (Li et al., 2022), given the myriad sources of variability at sea (Christensen et al., 2018). Artificial Intelligence may help further improve the sustainability of fishing, as classical selectivity studies are reaching a plateau due to a bottleneck in data collection inherent to the challenge of obtaining direct, in situ observations.
This paper addresses common stimuli that trigger fish reactions to selective devices in fishing gears, and how these behavioral responses can be transformed into quantifiable metrics with selectivity modeling and classification methods that can be pipelined into AI workflows (Section 2). Section 3 presents the current state and limitations of AI applied to fish-gear interactions through a bibliometric analysis and reviews recent developments in automatic behavior recognition. Section 4 addresses the hurdles of observing fish interactions across fishing gear selectivity studies and how AI methods may help face these challenges.
2 Observing stimuli-response in fishing gears: The teaching base of AI models for behavior recognition
“Researchers now realised that, like the rest of the vertebrate kingdom, fishes exhibit a rich array of sophisticated behaviour and that learning plays a pivotal role in behavioural development of fishes. Gone, or at least redundant, are the days where fishes were looked down upon as pea-brained machines whose only behavioural flexibility was severely curtailed by their infamous 3-second memory” (Brown et al., 2006)
2.1 Observations of fish behavior in fishing gears
Early testing, through manual counting, size measurement, and quantification of catches/retention, paved the way for selective devices and gear modifications to be integrated into the design of commercial fishing gear. Mesh modifications were suggested through empirical approaches by studying catch retention (e.g., catch comparison or covered codend methods) (Dealteris and Reifsteck, 1993; Ordines et al., 2006; Aydin and Tosunoğlu, 2010b; Anders et al., 2017b), tank experiments with manual observations of fish passing through meshes (Glass et al., 1993; Glass et al., 1995), and even numerical approaches that estimate catches a posteriori (e.g., SELECT; Fonseca et al., 2005). Optic and sonar imaging rapidly came into play to directly estimate catches during capture (Silicon Intensified Target, SIT camera system, Krag et al., 2009; acoustic imaging, Ferro et al., 2007), and were then applied to observe species behavior in gears (Mortensen et al., 2017). Over the years, observing fishes has become achievable in various conditions thanks to the breadth of available technology that can be autonomously deployed for ecological and fisheries monitoring (Durden et al., 2016; Moustahfid et al., 2020). Examples of technological solutions for observing behavior in real-world conditions are presented in Table 1.
Interesting fish behaviors have since been unearthed, such as anti-predatory responses (Rieucau et al., 2014), encounters of fish with nets (Jones et al., 2008; Rudstam et al., 2011), differences in swimming speed (He, 1993; Breen et al., 2004; Spangler and Collins, 2011), avoidance (de Robertis and Handegard, 2013), exhaustion (Krag et al., 2009), orientation (Odling-Smee and Braithwaite, 2003; Holbrook and Perera, 2009; Haro et al., 2020), escapement (Glass et al., 1993; Mandralis et al., 2021), herding behavior (Ryer et al., 2006), and unique social behaviors (Anders et al., 2017a), on which gear selectivity studies are based. Knowledge of fish reaction and escape behavior has thus grown, leading to the development of novel gears with more open meshes, careful placement of sorting grids, and other devices to improve both size and species selectivity (Stewart, 2001; Watson and Kerstetter, 2006; Vogel, 2016; O’Neill et al., 2019). Gear selectivity might also be improved by triggering active species responses using light, sound, and physical stimuli (O’Neill and Mutch, 2017).
2.2 Current observations of fish stimuli-response
2.2.1 Responses to light and color stimuli
Fish responses to light have mainly been studied in controlled environments and in aquaculture, as light attenuation at sea limits direct observations of fish behavior. The response to light—i.e., phototaxis—can improve gear selectivity because fishes greatly depend on vision for sensory information (Guthrie, 1986). Depending on the species and the developmental stage (Kunz, 2006), fishes can exhibit either positive phototaxis (swimming towards a light source) or negative phototaxis (swimming away) in response to different wavelengths and intensities of light (Raymond and Widder, 2007; Underwood et al., 2021). Artificial illumination is therefore attracting considerable attention for the behavioral guidance of fishes, either to dissuade fishes from entering the gear (Larsen et al., 2017) or to help them escape from within (Southworth et al., 2020). Illumination in gears takes the form of either LED light installations (e.g., illuminated escape rings for non-targeted species, Watson, 2013; illuminated separation grids for groundfishes, O’Neill et al., 2018b) or glow-in-the-dark netting material (Karlsen et al., 2021). In dark environments, near-infrared or red light is usually used to observe fishes instead of white light, which may disrupt their behaviors (Widder et al., 2005; Raymond and Widder, 2007; Underwood et al., 2021).
Responses of fish to color also play an important part, as most bony fishes are tetrachromatic, allowing them to see colors more vividly than humans (Bowmaker and Kunz, 1987). Some fishes may be more visually sensitive to certain light wavelengths and intensities (Lomeli and Wakefield, 2019), while others may be non-responsive (Underwood et al., 2021). Researchers thus use these species-selective traits to install light devices (LED lights, infrared light, laser beams) on gears or to change the color of the fishing nets (white, transparent, black) depending on the selected species (Simon et al., 2020; Méhault et al., 2022).
2.2.2 Responses to acoustic stimuli
Sound has long been used by fishers to scare fishes and gather them for bottom trawling. Yet the response to sound—i.e., phonotaxis—can also be used for selectivity, as hearing species are generally sensitive to specific frequencies (Dijkgraaf, 1960). Selectivity studies typically observe negative phonotaxis (i.e., avoidance) triggered by low-frequency sound (Schwarz and Greer, 2011), which fishes can display in different ways (Popper and Carlson, 1998; de Robertis and Handegard, 2013). As with light responses, some fishes are more sensitive to certain sound frequencies; these are called “hearing specialists”, such as Atlantic herring and cod (Chapman and Hawkins, 1973; Doksæter et al., 2012; Pieniazek et al., 2020). O’Neill et al. (2019) also suggested that passive acoustic approaches with sound reflectors can be designed into gears to make them more detectable to echo-locating species (He, 2010). Overall, sound and light added to fishing gears can help attract targeted species and deter vulnerable or harmful animals such as mammals or fish predators (Putland and Mensinger, 2019; Lucas and Berggren, 2022). Although fishing techniques using sound have long been practiced (He, 2010), the exploration of species-selective sound devices is still at an early stage.
2.2.3 Responses to physical stimuli
The response to physical contact—i.e., thigmotaxis—shows the tendency of fishes to remain close to the seabed or to the lateral structure of gears (Millot et al., 2009). This behavior can be exploited by modifying mechanical structures and panels in gears. Physical stimuli can play an important role in allowing fishes to escape (Mandralis et al., 2021) or be sorted (Larsen and Larsen, 1993; Brinkhof et al., 2020). Such devices are usually installed in or on the gears after a series of behavioral trials on fish responses to different configurations (Santos et al., 2016). Physical stimuli are thus often drawn from species-specific behavior (Ferro et al., 2007; Cuende et al., 2020a).
Fishes tend to orient themselves to face the water flow to hold a stationary position and lower the amount of energy they spend; this is called rheotaxis (Painter, 2021). This directional behavior due to water flow may be used to improve selectivity in trawls. For example, veil nets in shrimp fisheries can modify the flow within gears, directing fishes to selective grids and net structures (Graham, 2003), and water jets projecting downward or forward can elicit early avoidance from fishes about to enter the gear (Jordan et al., 2013).
2.2.4 Other stimuli and combination of stimuli
Other stimuli relating to chemical responses (i.e., chemotaxis; Løkkeborg, 1990) and electrosensory responses (i.e., electrotaxis; Sharber et al., 1994; O’Connell et al., 2014) in fishes still need to undergo trials. Chemotaxis, which fishes use for foraging, may help them acquire information from greater distances (Weissburg, 2016) and is exploited in baited fisheries (Rose et al., 2005). Electrotaxis, which elasmobranchs use to detect weak electromagnetic signals, is exploited in longline fishing to reduce bycatch with electropositive metals and magnets (Kaimmer and Stoner, 2008; Robbins et al., 2011; O’Connell et al., 2014). Combinations of multiple stimuli, such as acoustic and visual signals, also promote different responses from fishes, enhancing or impeding the responses to other cues (Lukas et al., 2021). Overall, understanding the multi-sensory modalities of marine animals may help adjust selective devices, reducing bycatch and focusing catches on targeted species (Walsh et al., 2004; Jordan et al., 2013).
2.3 AI application to fish stimuli
Studying fish responses to stimuli requires empirical studies, which are often limited in replicates due to logistical constraints and the time needed to collect and process raw data. Stimuli have thus been studied manually, since automation remains difficult to apply in situ due to heterogeneous, moving backgrounds and variable environmental conditions. Manual observations of stimuli responses currently provide the reference point for behavior recognition, which now faces ever more data to process from continued observations at sea. Applying AI models may ease data processing and enable the exploitation of larger amounts of data. As opposed to traditional tracking methods applicable to controlled experiments (e.g., background subtraction and Kalman filters, Simon et al., 2020), deep learning models are less sensitive to background variability and may be applied in harsher conditions (Sokolova et al., 2021). Computer vision can also be improved by selecting the observation system most appropriate to produce imagery data for the fishing gear used; the variety of systems and data processing approaches for stimuli is presented in Table 2.
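To make the contrast concrete, below is a minimal sketch (in Python with OpenCV) of the traditional approach mentioned above: background subtraction to segment a moving fish, with a Kalman filter smoothing its centroid track. The file name, thresholds, and single-fish assumption are illustrative only, and such a pipeline behaves well only against the relatively static backgrounds of controlled experiments.

```python
import cv2
import numpy as np

cap = cv2.VideoCapture("tank_experiment.mp4")   # hypothetical tank footage
subtractor = cv2.createBackgroundSubtractorMOG2(history=500, varThreshold=16)

# Constant-velocity Kalman filter: state (x, y, vx, vy), measurement (x, y)
kf = cv2.KalmanFilter(4, 2)
kf.transitionMatrix = np.array([[1, 0, 1, 0],
                                [0, 1, 0, 1],
                                [0, 0, 1, 0],
                                [0, 0, 0, 1]], np.float32)
kf.measurementMatrix = np.array([[1, 0, 0, 0],
                                 [0, 1, 0, 0]], np.float32)
kf.processNoiseCov = np.eye(4, dtype=np.float32) * 1e-2

track = []
while True:
    ok, frame = cap.read()
    if not ok:
        break
    mask = subtractor.apply(frame)                        # moving pixels only
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    predicted = kf.predict()                              # smoothed estimate
    if contours:                                          # assume one fish
        m = cv2.moments(max(contours, key=cv2.contourArea))
        if m["m00"] > 0:
            cx, cy = m["m10"] / m["m00"], m["m01"] / m["m00"]
            kf.correct(np.array([[cx], [cy]], np.float32))
    track.append((float(predicted[0]), float(predicted[1])))
```

Any change in illumination or a moving net in the background quickly breaks this segmentation, which is precisely why deep learning detectors are preferred in situ.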
Table 2 Examples of fish behavior studies exploring species’ responses to stimuli using AI and their application on fisheries.
3 Artificial Intelligence for fish behavior applications
3.1 Bibliometric analysis
3.1.1 Bibliometric analysis methods
A bibliographic search was conducted in February 2022 on SCOPUS for scientific journals using two sets of five queries (Figure 1). Each query of the first set (256 articles) included AI-related keywords, selected to retrieve studies focusing on fish behavior, underwater observations, fishing gears, and ecological studies. The second set used the same keywords as the first but added, in all five queries, keywords for both saltwater and freshwater ecosystems, in order to exclude automatic detection and classification of fish species done onboard fishing vessels. This narrowed the number of extracted publications down to 138 articles (Figure 1). However, both sets of publications still included studies not relevant to the topic, so a manual screening was undertaken: each extracted study was examined individually to keep only the relevant ones, which were then cross-analyzed with other pertinent studies not included in the SCOPUS results but mentioned in this review. The studies removed from the list focused on topology mapping, stock assessment, climatological studies, biochemical studies, and automatic identification of other marine fauna and flora such as sea cucumbers and algae. A final list of 384 relevant studies was collected and reviewed to extract those on automated fish detection, counting, species classification, motion tracking and behavior recognition with deep learning models in underwater systems.
Figure 1 Visualization of the bibliographic search. Top panel: Set of queries in SCOPUS and number of resulting articles. The Fish* W/2 ecology keyword was used to focus the search on ecologically-based studies. Bottom panel: Bibliometric landscape of topics from the articles (linkage of keywords, occurrence > 5).
3.1.2 Bibliometric analysis results
The gathered studies show that the automation of tasks such as fish detection, species classification, fish counting, fish tracking, and behavior recognition is progressively materializing in the 21st century (Figure 2). Ecological studies of fishes based on AI and computer vision have surfaced in the past 10 years (87 publications related to fish detection and classification; 36 related to fish behavior recognition, extracted from the bibliographic search in SCOPUS). Developments are still in their early stages but are gaining attention rapidly, particularly for automatic detection and classification techniques, thanks to the rise of deep learning (LeCun et al., 2015). Studies on automatic motion tracking and behavior recognition of fishes are fewer than detection and classification studies, as they build on the AI methods of the latter and require more complex processing. While fish detection has been widely applied in marine habitats for several years (Fisher et al., 2016), automatic tracking and behavior recognition of fishes during the capture process have yet to be applied. The following sections expand on the results of the bibliometric analysis, give a brief explanation of AI, and present current applications of behavior recognition that can be transferred to selectivity studies.
Figure 2 Number of publications between 1989 and 2022 for the 3 categories. The number of publications in all categories is from the cross-analysis between bibliographic search in SCOPUS and manual search in both Google Scholar and Web of Science. The final list includes 388 relevant articles reviewed one by one and categorized by the authors according to the methods included in each study.
3.2 Introduction to Artificial Intelligence
As current observations of fish behaviors in fishing gears step into the era of AI and deep learning, along with other domains in marine science (Malde et al., 2020; Logares et al., 2021; Packard et al., 2021), the Internet of Underwater Things (IoUT) and Big Data coupled to AI will inevitably revolutionize the field (Jahanbakht et al., 2021). Today, behavioral studies in fisheries science stand on rapidly evolving tools that automate the analysis and processing of data. These tools are curated from interdisciplinary fields spanning marine science, computer science, neuroscience, and mechanical science, among many other disciplines that are now converging because of AI (Xu et al., 2021). Useful references and reviews on AI in marine science can be found in Beyan and Browman (2020); Malde et al. (2020) and Goodwin et al. (2021).
In marine sciences, neural networks used for object detection are usually “supervised” (Cunningham et al., 2008), meaning that they are trained using ground-truth objects, manually located in images and classified into pre-defined classes. These objects, defined using the four coordinates of their bounding boxes and their associated classes (see Figure 3 for examples of bounding boxes), are then used to train the model to localize and classify the target objects within new images. Objects are assigned to one or several categories based on the probability of belonging to each of the classes used to train the model (Pang et al., 2017; Ciaparrone et al., 2020). Once object detection is done on different frames (Figure 4E, F), the tracking model pairs the bounding boxes among frames to reconstruct the track of each object through time and space (Belmouhcine et al., 2021; Park et al., 2021). During training, if the model can predict classes and bounding boxes that match the ground-truth validation data with a minor error, depending on the given parameters, it can be considered an accurate model. If its predictive performance is poor, the learning continues.
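As an illustration of these notions, the sketch below (Python) shows one common way a ground-truth object is encoded, four bounding-box coordinates plus a class, and how a predicted box can be matched against it with an Intersection-over-Union (IoU) score. The class name, coordinates, and the 0.5 matching threshold are illustrative examples, not values taken from the studies cited above.

```python
from dataclasses import dataclass

@dataclass
class Annotation:
    x1: float            # bounding-box corners
    y1: float
    x2: float
    y2: float
    label: str           # pre-defined class

def iou(a: Annotation, b: Annotation) -> float:
    """Overlap ratio used to decide whether a prediction matches ground truth."""
    ix1, iy1 = max(a.x1, b.x1), max(a.y1, b.y1)
    ix2, iy2 = min(a.x2, b.x2), min(a.y2, b.y2)
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (a.x2 - a.x1) * (a.y2 - a.y1)
    area_b = (b.x2 - b.x1) * (b.y2 - b.y1)
    return inter / (area_a + area_b - inter) if inter > 0 else 0.0

truth = Annotation(120, 80, 260, 150, "sardine")   # manually located fish
pred  = Annotation(128, 85, 270, 160, "sardine")   # model output
matched = iou(truth, pred) > 0.5 and truth.label == pred.label
```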
Figure 3 Examples of bounding boxes of fishes. Top panel: Tracking of fishes on the open-source VIAME platform for image and video analysis (Dawkins et al., 2017). Bottom left: multiple trajectories of black seabreams around a fixed bait. Bottom right: In situ detections of sardines and horse mackerel inside a gear (Game of Trawls Project).
Broadly speaking, images are fed into computer algorithms to extract information. These algorithms contain artificial neural networks that apply a sequence of mathematical operations (convolution, pooling, etc.) to perform object detection. The operations are linked together to orchestrate a pipeline, so that image processing is not interrupted (Figure 4G). They can detect objects because they identify patterns in pixels (i.e., the binary units computers work with; Shaw, 2004; Pietikäinen et al., 2011) of the input images that define features (Blum and Langley, 1997). Features are measurable variables that can be interpreted from images, such as the shapes and textures of objects (Chandrashekar and Sahin, 2014). Algorithms trained to detect patterns from features automatically are called detection models. Before training a model, images are preprocessed and enhanced (e.g., biases neutralized and dimensions scaled) so that models can learn better (Nawi et al., 2013; Calmon et al., 2017), since data captured in real-world conditions are generally noisy. Recent artificial neural networks contain attention modules (Vaswani et al., 2017; Gupta et al., 2021) to capture long-range dependencies and understand what is going on in an image globally (Grauman and Leibe, 2011).
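The sketch below (assuming PyTorch) makes this sequence of operations concrete: convolutions extract pixel patterns, pooling condenses them into features, and a final linear layer scores the pre-defined classes. The architecture and layer sizes are arbitrary toy choices, not a model from the cited literature.

```python
import torch
import torch.nn as nn

class TinyFishNet(nn.Module):
    def __init__(self, n_classes: int = 2):             # e.g., fish / no fish
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1),  # local pixel patterns
            nn.ReLU(),
            nn.MaxPool2d(2),                             # pooling: condense
            nn.Conv2d(16, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),                     # global summary
        )
        self.classifier = nn.Linear(32, n_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        f = self.features(x).flatten(1)                  # learned features
        return self.classifier(f)                        # class scores

# One preprocessed (scaled, normalized) 3-channel 64x64 image
logits = TinyFishNet()(torch.randn(1, 3, 64, 64))
```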
Figure 4 General process from in situ observations to behavior classification. (A) Representation of a section of an active gear (i.e., pelagic net) with a white-colored material that acts as a clear background for video capture. (B) Representation of a passive gear (i.e., baited gear): a baitfish prototype fixed on the seafloor with a remote underwater video set-up. (C) Field of vision of a camera securely attached to one side of the pelagic net section. (D) Field of vision of a camera facing the bait. (E, F) Frames from video footage of the underwater observation systems. (G) General workflow for deep learning model application to object detection. (H, I) Sample of fish detections with bounding boxes and fish tracking with bounding boxes and line trails (Game of Trawls and BAITFISH). (J) Representation of behavior classification labels inside an active gear. The “region of interest” labels the section of the gear near the exit and “escaping” labels the fishes that are exiting. (K) Representation of behavior classification labels with a passive gear. The “region of interest” labels the area in proximity of the bait and “approaching” labels the fish within this proximity. 3D model of baited gear credited to the BAITFISH project; image of fishes inside the pelagic net credited to the Game of Trawls project.
Current deep learning methods are mostly “black boxes”, since humans cannot see how individual neurons work together to compute the final output (e.g., why a fish in an image has been detected or not), so improving model accuracy relies on better inputs and comparison of trainings (LeCun et al., 2015). However, unsupervised learning is gaining interest as it allows the transition from recognition to cognition (Forbus and Hinrichs, 2006; Xu et al., 2021), and innovations in the AI domain are now producing interpretable models that can explain why and how they localize and classify objects in a scene (Ribeiro et al., 2016; Hoffman et al., 2018; Gilpin et al., 2019). Among unsupervised learning models, Generative Adversarial Networks (GAN) are composed of two networks: a generator that produces synthetic data and a discriminator that classifies data as real or fake. The generator learns to fool the discriminator by learning the real data distribution and generating synthetic data that follow it; ideally, the discriminator becomes unable to distinguish real from synthetic data. Object detection models can thus be coupled to a GAN and learn in a semi-supervised manner, with artificially generated sets of images from the generator feeding the object detector (e.g., the generator produces synthetic images of fishes for another model to detect; Creswell et al., 2018). Applying these AI methods to fish interactions with fishing gears would enable us to decipher which behaviors lead to the catch and escapement of fish at far larger scales than achievable today. For a comprehensive review of available deep learning-based architectures, see Aziz et al. (2020).
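The adversarial game can be compressed into a few lines, as in the hedged sketch below (assuming PyTorch). Both networks are toy multilayer perceptrons over flattened 28x28 “images”; a real pipeline for generating synthetic fish imagery would use convolutional architectures and much longer training.

```python
import torch
import torch.nn as nn

G = nn.Sequential(nn.Linear(64, 256), nn.ReLU(), nn.Linear(256, 784), nn.Tanh())
D = nn.Sequential(nn.Linear(784, 256), nn.ReLU(), nn.Linear(256, 1))
loss = nn.BCEWithLogitsLoss()
opt_g = torch.optim.Adam(G.parameters(), lr=2e-4)
opt_d = torch.optim.Adam(D.parameters(), lr=2e-4)

real = torch.rand(32, 784) * 2 - 1            # stand-in for real fish images

# Discriminator step: learn to tell real images from synthetic ones
fake = G(torch.randn(32, 64)).detach()
d_loss = loss(D(real), torch.ones(32, 1)) + loss(D(fake), torch.zeros(32, 1))
opt_d.zero_grad(); d_loss.backward(); opt_d.step()

# Generator step: learn to fool the discriminator
fake = G(torch.randn(32, 64))
g_loss = loss(D(fake), torch.ones(32, 1))     # want fakes judged as real
opt_g.zero_grad(); g_loss.backward(); opt_g.step()
```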
3.3 AI for fish behavior
Tools for automatic behavior recognition are being developed mainly in aquaculture (Valletta et al., 2017; Niu et al., 2018) and for coastal fish communities (e.g., Kim, 2003; Fisher et al., 2016; Capoccioni et al., 2019; Lopez-Marcano et al., 2021; Ditria et al., 2021a). Over the last decade, automatic fish detection and species classification have emerged alongside tracking innovations, contributing to a robust foundation for behavior recognition. Behavioral studies of fishes in aquaculture have examined feeding behavior to monitor appetite, as well as abnormal behaviors in intensive farming conditions (Kadri et al., 1991; Zhou et al., 2017; Niu et al., 2018; Måløy et al., 2019; Pylatiuk et al., 2019; Li et al., 2020). Behaviors that were automatically detected include feeding movements at individual and school levels, feeding intensity (Zhou et al., 2019), abnormal behaviors due to lack of oxygen or stress response (J. Wang et al., 2020), and curiosity expressed as inspection behaviors when interacting with bait or objects in experimental set-ups (Papadakis et al., 2012).
In laboratory experiments, goal-directed behaviors of fishes have also been recognized by computer vision and automatically detected (Long et al., 2020), such as the construction of spawning nests by cichlid fishes that either form mounds or burrow in the sand. This type of complex behavior can be distilled into recognizable patterns, such as manipulation of the physical environment (a cichlid fish uses its mouth and fins to move sand) and distinct fish movements such as quivering (a usual mating movement observed in cichlid fishes). Automatically recognizing these behavior patterns contributes to the systematic analysis of these traits across taxa (York et al., 2015) and can be an effective metric for measuring natural variation (Long et al., 2020).
Artificial Intelligence methods trained to recognize fish behavior have multiple components that are all connected in branching streams of mathematical and statistical operations. From a video of swimming schools of fishes, the attributes of what is happening in the scene would be broken down into features of the fishes, their appearance in terms of shape, texture, or color, and their reaction to different types of stimuli translated into quantifiable metrics. Some additional examples of applications can be found in Spampinato et al. (2010); Fouad et al. (2014); Hernández-Serna and Jiménez-Segura (2014); Iqbal et al. (2021) and Lopez-Marcano et al. (2021).
3.3.1 AI-based automatic behavior recognition for fishes
Fish detection by AI models refers to recognizing individuals or species in a single image (Sung et al., 2017). An algorithm is trained to identify fish features and localize regions in a scene. The YOLO (You Only Look Once; Redmon et al., 2016) object detection framework has frequently been used for fish detection and species classification on 2D images (Cai et al., 2020; Jalal et al., 2020; McIntosh et al., 2020; Raza and Hong, 2020; Bonofiglio et al., 2022; Knausgård et al., 2021). The YOLO algorithm and its successive versions are widely used because they process an entire image in a single pass, making them faster and often more accurate than classic object detectors (for technical specifications, see Redmon et al., 2016). A trained detection model can thus differentiate targeted from non-targeted species and identify differences in morphology (i.e., round vs flat fish). Moreover, a cluster of individual detections can also reveal herding behavior from crowd movements.
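As a usage illustration only, the sketch below runs a YOLO-family detector through the open-source ultralytics package; the cited studies used various YOLO versions and their own trained weights, and the checkpoint name here is hypothetical.

```python
from ultralytics import YOLO

model = YOLO("fish_yolo.pt")            # hypothetical fish-trained weights
results = model("trawl_frame.jpg")      # one pass over the whole image

for box in results[0].boxes:
    x1, y1, x2, y2 = box.xyxy[0].tolist()    # bounding-box corners
    species = model.names[int(box.cls)]      # predicted class, e.g. "sardine"
    print(f"{species} at ({x1:.0f},{y1:.0f})-({x2:.0f},{y2:.0f}), "
          f"confidence {float(box.conf):.2f}")
```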
Identifying different swimming patterns between targeted and non-targeted species, however, requires following the spatial alignment of trajectories inside gears and the direction of swimming through time, i.e., tracking. Fish tracking is done using motion algorithms applied to successions of images with multiple or individual fish, until the fish are no longer seen in the footage (Li et al., 2021). To track fishes, algorithms are either trained as a single network or coupled into a pipeline of networks for more complex behavior interpretations (Table 3). Different implementations of deep learning-based tracking have been used across studies, depending on their tracking objectives or available resources (for object detection: Faster R-CNN; for instance segmentation: Mask R-CNN; for tracking based on loss: Minimum Output Sum of Squared Error, MOSSE; for tracking based on comparing similarity among masks (similarity learning): Siamese Mask, SiamMask; and for tracking based on Non-Maximum Suppression applied to sequences: Seq-NMS). Their differences lie in the way they compute detections from frame to frame and associate them with new or existing tracks of detected fishes (Lopez-Marcano et al., 2021). Coupled networks in AI pipelines are thus used for tracking to interpret finer details of behavior (Table 3).
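The core association step shared by these trackers can be illustrated with a bare-bones greedy IoU matcher (Python), shown below: each new detection extends the existing track whose last box overlaps it most, and unmatched detections start new tracks. The trackers cited above add motion models, appearance embeddings, or sequence-level suppression on top of this basic idea; thresholds here are illustrative.

```python
def iou(a, b):
    """IoU of two (x1, y1, x2, y2) boxes."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    union = ((a[2] - a[0]) * (a[3] - a[1])
             + (b[2] - b[0]) * (b[3] - b[1]) - inter)
    return inter / union if union else 0.0

def track_step(tracks, detections, min_iou=0.3):
    """tracks: {track_id: [box, ...]}; detections: boxes from the new frame."""
    next_id = max(tracks, default=-1) + 1
    for det in detections:
        best_id, best_iou = None, min_iou
        for tid, boxes in tracks.items():
            overlap = iou(boxes[-1], det)
            if overlap > best_iou:
                best_id, best_iou = tid, overlap
        if best_id is None:                  # no overlap: a new fish appears
            tracks[next_id] = [det]
            next_id += 1
        else:                                # extend the existing trajectory
            tracks[best_id].append(det)
    return tracks

tracks = track_step({}, [(10, 10, 40, 30)])          # first frame
tracks = track_step(tracks, [(14, 12, 44, 32)])      # same fish, next frame
```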
To decipher the underlying behavioral patterns of fishes from manually or automatically generated fish tracks, repeated patterns can be translated into sets of labelled classes (e.g., n trajectories moving in a given x, y direction = escaping to the upper panel), each representing one or several specific behaviors. In AI, classes that can be labelled and quantified (e.g., a fish passing through a mesh) can be learned by a deep learning model, so that manual behavior classification can then be automated. In aquaculture, swimming behaviors have been manually classified and fed to an algorithm that learns to recognize the behavioral classes through computer vision (Long et al., 2020; J. Wang et al., 2020; Yu et al., 2021). In commercial fishing, the challenge lies in deciphering these patterns as fishes interact with different gear structures, modified parts, and selective devices. For AI models to classify these types of interactions, a systematic approach may first be needed in controlled environments, such as fish tanks or behavioral chambers. This would allow stimuli to be restricted and localized (Skinner, 2010) rather than enhanced or inhibited by spatiotemporal conditions (Ryer and Olla, 2000; Owen et al., 2010; Maia and Volpato, 2013; Heydarnejad et al., 2017; Lomeli et al., 2021).
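A minimal rule-based sketch of such labelling is given below (Python): a trajectory of centroids is assigned an escape class if it crosses a virtual escape line. The line position and class names are hypothetical; in practice they derive from the gear geometry and the study's own classification decisions, and a deep learning model would then be trained on many such labelled tracks.

```python
def label_track(track, escape_y=100.0):
    """track: list of (x, y) centroids ordered in time (image coordinates)."""
    start_y, end_y = track[0][1], track[-1][1]
    if end_y < escape_y <= start_y:
        return "escaping_upper_panel"     # crossed toward the upper panel
    return "staying_in_gear"              # remained below the escape line

print(label_track([(50, 200), (55, 160), (60, 90)]))  # escaping_upper_panel
```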
Recurrent AI models based on the LSTM architecture are attracting attention for fish tracking, since they are designed to give more weight to significant movement patterns among chaotic ones as they are trained, adding a more cognitive ability to the learning of AI models. For instance, Gupta et al. (2021) investigated different vision-based object-tracking algorithms for multiple fishes in underwater scenes, both in controlled and uncontrolled environments. They proposed DFTNet (Deep Fish Tracking Network), an object tracker combining two networks: a Siamese network, which uses two identical neural networks to re-identify fish, and an LSTM, which allows the model to learn from the fish's chaotic motions.
In fishing activities, AI architectures with attention and memory are thus particularly important to address the chaotic patterns seen among species during the capture process. Tracks can quantify swimming angles or abrupt changes in movement, distance from gear structures (Santos et al., 2020), mean trajectory relative to the stimulus source (Peterson, 2022), selective device placement, or differences in position between group and individual trajectories within gears. The visual features from automatic detection (i.e., color, texture, shape at the species, group, or individual level) and the spatiotemporal features from tracking (i.e., swimming direction, angle, speed) (Figure 4H, I) can then be combined to define the behavior classification (Figure 4J, K).
3.3.2 Behavioral classes tailored with AI architecture
Fish behavior recognition is when a model can recognize a behavior based on tracking features identified as events. An event is a scene (Figure 4A, B) directly observed from videos, for example, a group of fish swimming out of a fishing gear. The combination of fish detections and tracks (swimming patterns) can be categorized as a class, “escapement”, and behavioral metrics can be derived from such events (see Figure 4J). Automatic behavior recognition is thus trained from classified sets of tracking features and is the final step in synthesizing chaotic fish swimming into distinct sets of behaviors.
Classes of behaviors are defined by scientists and used to label an image sequence or a video clip that shows a defined behavior. For example, a class label of escapement behavior can be defined from a clip of a fish passing through a mesh, for instance when the detected body of the fish overlaps or touches the mesh. A behavior class of a fish not escaping is when the detected trajectory of the fish stays within the mesh barrier, and a class can consequently show it has escaped if the tracked fish is detected outside the gear. The exact criterion for labelling a fish as escaped depends on the study's classification decisions (i.e., either when the fish's body is entirely outside the gear or as the fish passes through the mesh). Classes can also be separated into action and non-action classes (see Table 3), where a defined behavior present in a video clip is labeled as the action class, and another clip presenting unchanged or normal fish movement is labeled as the non-action class. McIntosh et al. (2020) defined four features that translate the startle behavior of sablefish from their trajectories into measurable metrics: direction of travel, speed, aspect ratio, and a Local Momentary Change metric. They combined the four features into a form suited to train an AI-based classifier with an LSTM architecture (i.e., tensor data). As when applying LSTMs to tracking, an LSTM-based behavior recognizer efficiently remembers important features to classify swimming movements (Niu et al., 2018; L. Yang et al., 2021). Behavior classes have been defined in selectivity studies as events classified in empirical models (Santos et al., 2016, Santos et al., 2020) or in video tracking software (Noldus et al., 2001). J. Wang et al. (2020) proposed a 3-stage pipeline for real-world detection of anomalous behavior in multiple fish under high stress. Examples of AI pipelines are summarized in Table 3, including the underwater scene, light source, and type of underwater observation system used, for comparison.
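In the spirit of that approach (and only as a toy sketch, not McIntosh et al.'s actual implementation), a track can be encoded as a (time steps x 4 features) tensor of direction, speed, aspect ratio and local momentary change, and the LSTM's final hidden state scored as “startle” vs “normal”, assuming PyTorch:

```python
import torch
import torch.nn as nn

class BehaviorLSTM(nn.Module):
    def __init__(self, n_features=4, hidden=32, n_classes=2):
        super().__init__()
        self.lstm = nn.LSTM(n_features, hidden, batch_first=True)
        self.head = nn.Linear(hidden, n_classes)

    def forward(self, x):                 # x: (batch, time, features)
        _, (h, _) = self.lstm(x)          # h: final hidden state per track
        return self.head(h[-1])           # class scores: startle vs normal

tracks = torch.randn(8, 30, 4)            # 8 tracks, 30 frames, 4 features
scores = BehaviorLSTM()(tracks)           # shape (8, 2)
```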
3.3.3 The problem of occlusion emphasized in the crowded scenes of fishing
The occlusion problem arises when fishes overlap or swim behind one another, leading to lost detections and fragmented tracks (Gupta et al., 2021). Multiple-object tracking on videos is challenging since overlaps are flattened in a 2D view (see Figure 4C, D, F), and this problem is acute when studying behaviors in the crowded scenes of fishing. In 2D images and videos, training models to recognize the body parts of fish can help overcome occlusion: if a detector fails to locate an entire fish, a tracker can still follow the movement based on other features of the fish (i.e., eye, fins, tail). For example, Liu et al. (2019) simultaneously track the fish head and the center of the body, so the head can be detected even when the body center is hidden. Trackers can therefore maintain fish identity through occlusions if the model learns more appearance features (Qian et al., 2014). Fish heads have relatively fixed shapes and colors, so tracking them from frame to frame remains possible even after frequent occlusions (L. Yang et al., 2021): the darker color intensity of a head behind another fish and its elliptical shape can be characterized as a blob and still be tracked.
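One simple appearance cue for re-identification can be sketched as follows (Python with OpenCV): compare the color histogram of the lost fish's last image crop with those of new detections. Real systems learn much richer appearance features, and the similarity threshold here is purely illustrative.

```python
import cv2

def appearance(crop):
    """Normalized hue-saturation histogram of a BGR image crop of a fish."""
    hsv = cv2.cvtColor(crop, cv2.COLOR_BGR2HSV)
    hist = cv2.calcHist([hsv], [0, 1], None, [30, 32], [0, 180, 0, 256])
    return cv2.normalize(hist, hist).flatten()

def same_fish(last_crop, candidate_crop, threshold=0.7):
    """True if the candidate detection looks like the fish lost to occlusion."""
    similarity = cv2.compareHist(appearance(last_crop),
                                 appearance(candidate_crop),
                                 cv2.HISTCMP_CORREL)
    return similarity > threshold
```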
Three-dimensional tracking from stereo cameras or multiple-camera systems, where 3D components can be triangulated, can help address occlusion problems. By reconstructing trajectories in a 3D view, fish trajectories are seen with depth, improving the re-identification of a fish after an occlusion (Cachat et al., 2011; Huang et al., 2021). However, AI models trained to recognize 3D trajectories demand computationally intensive algorithms to associate the deconstructed features together (L. Yang et al., 2021).
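The geometric core of stereo reconstruction is a single triangulation call, as in the minimal sketch below (Python with OpenCV). The projection matrices come from camera calibration; the ones here, and the pixel coordinates, are made-up examples.

```python
import cv2
import numpy as np

# 3x4 projection matrices from stereo calibration (toy values: identity
# intrinsics, right camera shifted 0.1 units along the x-axis)
P1 = np.hstack([np.eye(3), np.zeros((3, 1))]).astype(np.float32)
P2 = np.hstack([np.eye(3), np.array([[-0.1], [0.0], [0.0]])]).astype(np.float32)

left_px = np.array([[320.0], [240.0]], np.float32)    # fish centroid, left view
right_px = np.array([[300.0], [240.0]], np.float32)   # same fish, right view

homog = cv2.triangulatePoints(P1, P2, left_px, right_px)   # 4x1 homogeneous
xyz = (homog[:3] / homog[3]).ravel()                        # 3D point
print(f"fish at x={xyz[0]:.3f}, y={xyz[1]:.3f}, z={xyz[2]:.3f}")
```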
3.3.4 Transfer learning for data-deficient environments
We have shown that assessing fish behavior relies on analyzing trajectories. Considering tracks instead of single detections of fishes on frames generates even larger amounts of data. Thousands to millions of such fish trajectories have likely been generated worldwide, and these data may now be used to train models to detect fishes, at the species level or as generic fish, in unseen environments. We provide a few examples of available published datasets that have been used to train models (Table 4).
Table 4 Summary of public datasets of fish images and videos for AI model training, merged from open-access databases, from collections of generic image datasets (including objects other than fishes), and from Ditria et al. (2020); Saleh et al. (2020) and Pedersen et al. (2021).
For tropical fishes, Fish4Knowledge (F4K; Fisher et al., 2016), a project that started in 2010, garnered millions of images from GoPro cameras set up in coral reef areas of Taiwan. The project resulted in 87K hours of video (95 TB) and 145 million fish identifications. The successfully curated database was then made available to the rest of the world, and most developments in automatic classification and identification tools for fishes have used it to train deep learning models (see in Table 4 uses of F4K: Spampinato et al., 2010; Palazzo and Murabito, 2014; Shafait et al., 2016; Jalal et al., 2020; Murugaiyan et al., 2021). For temperate fishes, only a few commercial species can be automatically identified by existing models, but they are nonetheless gaining recognition. Bonofiglio et al. (2022) trained an AI pipeline to detect and track sablefish, Anoplopoma fimbria, in an underwater canyon in North America using ~650 hours of video recording with ~9000 manual annotations. Thanks to growing fish databases and image processing techniques, AI models can now detect fishes with human-like accuracy for some species, such as the scythe butterflyfish (Benson et al., 2013), some tropical species (Spampinato et al., 2010), and mesopelagic species (Allken et al., 2021a).
Studying fish-gear interactions is particularly difficult due to the unique and challenging conditions often met at sea. Pipelines of automatic detection have applied transfer learning and data augmentation techniques to cope with the lack of available data. For example, Knausgård et al. (2021) applied transfer learning to train an AI system to identify commercially valuable temperate fishes, such as wrasses (Ctenolabrus rupestris, C. exoletus and Symphodus melops) and gadoids (Gadus morhua, Pollachius virens, P. pollachius, Molva molva, and Melanogrammus aeglefinus). Using models pre-trained on available public datasets (see Table 4, e.g., Fish4Knowledge and ImageNet), they obtained high accuracies in object detection and classification with their fine-tuned models (86.96% and 99.27%, respectively). Transfer learning from pre-existing object detection algorithms, coupled with existing data from other environments, can thus be a promising approach for the automatic analysis of fish species even in environments that still lack data (Fisher et al., 2016; Siddiqui et al., 2018; Knausgård et al., 2021). Additional augmentation methods, such as generating synthetic datasets, may help overcome the insufficiency of small training datasets (Allken et al., 2019; Villon et al., 2021).
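The transfer-learning recipe itself is short, as in the hedged sketch below (assuming the torchvision API, version 0.13 or later): start from a backbone pre-trained on a large generic dataset, replace the classification head with the local species list, and fine-tune only the new head on the small annotated dataset. The species list and sizes are illustrative, and this is not the exact pipeline of the cited study.

```python
import torch
import torch.nn as nn
from torchvision import models

species = ["Ctenolabrus rupestris", "Gadus morhua", "Pollachius virens"]

net = models.resnet18(weights=models.ResNet18_Weights.IMAGENET1K_V1)
for p in net.parameters():
    p.requires_grad = False                     # keep pre-trained features
net.fc = nn.Linear(net.fc.in_features, len(species))   # new trainable head

optimizer = torch.optim.Adam(net.fc.parameters(), lr=1e-3)
# ...train net.fc on the small annotated fish dataset as usual...
logits = net(torch.randn(1, 3, 224, 224))       # scores over local species
```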
4 Discussion
4.1 Insights from AI applications for behavior recognition from other domains
Automated behavior recognition has been applied in several domains outside of fisheries. Dynamic systems of fish schools, like any large groups of moving individuals such as birds or insects (Chapman et al., 2010; Altshuler and Srinivasan, 2018), produce bundles of condensed and interloping trajectories when tracked. Directional patterns of behavior (i.e., individual or collective) can be interpreted from them (Sinhuber et al., 2019), but visual details of targets can be lost in footage due to occlusions or motion blur (Liu et al., 2016). Conveniently, apart from data enhancement methods, algorithms and AI methods already exist that specifically address this challenge in natural systems of humans, social animals and insects (e.g., swarm intelligence, Ahmed and Glasgow, 2012; Boids algorithms, Alaliyat et al., 2014). Algorithms to track behavior in congested human crowds have been developed based on motion capture and optical flow techniques (Krausz and Bauckhage, 2011). Different types of human behavior can now be recognized by AI in all sorts of environments, owing to the considerable attention the domain receives and because high-performing models learn from gigantic training databases of diverse human behavior (Popoola and Wang, 2012; Vinicius et al., 2013).
Three-dimensional motion capture techniques can also provide more information, such as depth and detailed tracking of animal paths (Wu et al., 2009). Moreover, 3D trajectories can provide the analytics (i.e., positions, velocities, accelerations) to study cohesive and unique behaviors (Sinhuber et al., 2019). For instance, Liu et al. (2016) proposed an automatic tracking system that can reconstruct 3D trajectories of fruit flies using three high-speed cameras and that can be generally adapted to large swarms of moving objects. Dollár et al. (2012) made use of features of human pedestrians to geometrically quantify their overlaps and distances on a 2D scale. The AI models that recognize facial features and postures of humans or other animals therefore have the algorithmic backbone to extract behavior. Since algorithms can be scalable and adaptable (see Section 3.3.4 on transfer learning), such AI models may now be adapted to fish features and postures.
4.2 Towards smart fishing
The way we fish is constantly evolving. The more we understand the impact of fishing, the more we look for ways to make our fishing gears more selective. We are no longer just modifying the components of gears but also adding devices and camera systems to them to create intelligent fishing gears. This turns fishing operations into interactive, fine-scale observation platforms rather than catch-then-see operations (Rosen et al., 2013; Kyllingstad et al., 2022). The performance of modified fishing gears can be assessed almost in real time, which can raise the plateau of gear selectivity studies by exploring fish-gear interactions at finer scales. The challenge now lies in obtaining consistent findings from these direct observations. In the highly stimulating, crowded, and stressful scenes of fishing activities, subtle movements of fishes may turn into sharp and chaotic escapes where learned behavior and predispositions are overridden by survival instincts (Manière and Coureaud, 2020). Large volumes of fishes can also be influenced by herding behavior, and individuals may tend to follow the swimming routes of the group (Måløy et al., 2019). Addressing this herding constraint currently relies on applying complex pipelines, often coupled with stereovision (Rosen et al., 2013; Kyllingstad et al., 2022). Handling such data in real time is one of the current bottlenecks because it has to be processed within embedded AI systems, which, to equip fishing gears, must remain as light as possible, with controlled size, memory and power consumption. These issues will be partially solved as the algorithms presented above keep improving at handling occlusion (see Section 3.3.3 and Table 3) and as observation systems keep improving to meet the image quality required for AI applications (see Section 2.1 and Table 1).
In the meantime, AI may already facilitate the assessment of fishing gear modifications. When a fishing gear is designed with a new stimulus (e.g., Southworth et al., 2020; Ruokonen et al., 2021) or when its parts are modified (e.g., Feekings et al., 2019), it is impossible to single out with certainty that these changes are the dominant cause of behavioral changes leading to escape or retention, due to the large variability in external and internal factors affecting the fishes' responses. It is also unlikely that the exact same movements by the same community of fishes will be observed on two successive occasions (Ryer and Barnett, 2006; Ryer et al., 2010; Lomeli and Wakefield, 2019). Applying automatic behavior recognition in such situations would enable the processing of much larger amounts of data on fine-scale differences than could be done manually, even if it comes with some level of error inherent to any fully automatic recognition algorithm (Faillettaz et al., 2016; Villon et al., 2021). Complementary laboratory studies may also help collect the consistent findings (Hannah and Jones, 2012) needed to gather a database of automatically classifiable behaviors. For example, the influence of light intensity on juvenile walleye pollock Theragra chalcogramma was studied in laboratory conditions and in situ, showing that juveniles either struck the nets more often or swam closer to them in darkness than at the highest illumination (Olla et al., 2000). Such systematic behavioral responses could thus be used to train an AI model, which could then automatically analyze replicates of additional trials. Similarly, AI applications would make it possible to amplify the number of replicates of sea or laboratory trials, for example when assessing how changes in the positions of stimuli influence species behaviors (Larsen et al., 2017; Yochum et al., 2021).
4.3 Sharing and collaboration for the sake of fishes
Transfer learning from adaptable deep learning models developed in other behavioral studies and settings is key for automated fish behavior recognition, but executing this technically requires collaboration across the scientific community. Advances in fish behavior recognition in aquaculture and in situ environments often stem from joint efforts between ecologists and computer scientists: AI practitioners mostly know which algorithm or network is appropriate for a specific study case, while marine scientists provide the underlying ecological question and the inherent parameters (i.e., classification of fish behaviors, metrics for quantification) to fine-tune the algorithms. Successful automated behavior recognition models have benefited from huge streams of imagery data and unprecedented funding for technological development. Existing and previous data mining and collection practices have included outsourcing efforts: Fish4Knowledge branched out to volunteers, subprojects, and gamifying techniques (Fisher et al., 2016), while popular datasets such as ImageNet and COCO used Amazon Mechanical Turk to crowdsource annotations of objects (Gauen et al., 2017). McClure et al. (2020) argued that citizen science benefits AI applied to ecological monitoring, as it can fast-track data collection now that AI is within reach through integration in mobile devices and user-friendly platforms. The phytoplankton community is already benefitting from citizen science, as online portals allow volunteers to perform simple classification tasks, leading to millions of verified plankton identifications (Robinson et al., 2017). Moreover, scientists are adopting FAIR (Findability, Accessibility, Interoperability, and Reuse) data principles to realize the full value of fish behavior data and to carefully curate a unifying database (Guidi et al., 2020; Bilodeau et al., 2022).
Bridging the gap between computer and marine sciences can accelerate the development of powerful tools for automated fish monitoring (Goodwin et al., 2021). User-friendly software platforms for image processing and for analyzing animal tracks and events are publicly accessible and designed for non-AI experts (Dawkins et al., 2017). So even though observations of fish-gear interactions are more demanding, typically produce small datasets, and are distinctly case-specific, model training can still be aided by data transfer, open-access databases, and participatory platforms. This will benefit everyone, as end-tools that grow in performance will also grow in scalability thanks to shared data. With enough collaboration across domains, extensive engagement with fish ethologists to construct behavioral classifiers, and consistent sharing of reproducible, understandable, and scalable data, it might become possible to quantify, in near completeness, what a fish is doing and how it is interacting with its environment under any conditions.
4.4 Limitation of AI: A critical view
AI-adapted electronic fishing is still fairly new to fisheries, so practical applications to improve the selectivity of fishing gears may not be seen immediately. AI models depend on the quality of their training data, and suitable imagery is still lacking. In contrast to fisheries-based observation, land- and air-based behavior studies have more opportunities to use AI for automatic behavior recognition, as aerial and terrestrial devices can be smaller and lighter than underwater camera systems (e.g., Rosen and Holst (2013) for an underwater example; Liu et al. (2016) for a land example).
The environmental impact of these developing hardware and software systems in fisheries must also not be overlooked. They may reduce operational energy consumption through automation, but if intelligent tools are eventually deployed at a commercial scale, this may imply significant extraction of heavy metals to manufacture the hardware and an increased carbon footprint of storage servers (Gupta et al., 2022). Scientists should be cautious not to be swept away by the promise of intelligent fishing without also weighing the environmental cost of building and maintaining it. AI applications may tip the scale in favor of fishes, but the integration of AI into fisheries must be accompanied by environmental impact assessments and an active search for alternative materials for machines.
Furthermore, our perception of animal behavior can be anthropomorphic, and this bias may be transferred to artificial tools. Researchers have consistently pointed to the possible transfer of human bias into artificial intelligence, which can be worsened by training models on limited data (Horowitz and Bekoff, 2015). For now, humans still need to be cautious when identifying behaviors, whether by manual or automatic methods; unsupervised learning may help remove anthropomorphic biases (Sengupta et al., 2018).
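To make that last point concrete, behavioral modes can be discovered from the data rather than imposed by a human observer. Below is a minimal unsupervised sketch assuming scikit-learn; the trajectory descriptors and cluster count are hypothetical stand-ins for quantities a tracking pipeline would produce.

```python
# Minimal unsupervised sketch (assumes scikit-learn): cluster trajectory
# descriptors without human-assigned behavior labels.
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans

# Each row is one tracked fish segment, e.g., mean speed, turning rate,
# depth change, distance to the netting (random stand-in values here).
rng = np.random.default_rng(0)
features = rng.normal(size=(200, 4))

# Standardize, then let the data, not the observer, define behavior modes.
X = StandardScaler().fit_transform(features)
labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(X)

# Clusters are inspected and named post hoc, keeping anthropomorphic
# categories out of the training signal itself.
print(np.bincount(labels))
```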
Another critical view of the use of AI in fisheries acknowledges that it can be a double-edged sword. On one hand, it may help scientists understand fish behavior and reduce bycatch (e.g., Knausgård et al., 2021; Sokolova et al., 2021). On the other hand, it may help the fishing industry increase its catch through automated tools (Walsh et al., 2002). As with any other technological advancement, its practical value depends on how humans decide to use it (Bostrom and Yudkowsky, 2018). It is therefore up to stakeholders to discuss these issues with one another, to weigh both the negative and positive impacts of AI, and to lay down ethical practices that prevent the mishandling of this new technology. Debates on the use of AI tools in fisheries will arise, but if we move forward with the intention of addressing ecological problems and emphasize its use for selectivity, it may yield the tools for a sustainable use of our resources.
4.5 Navigating a rapidly evolving field of research
The main challenge in studying fish-gear interactions is not one of scarcity but of abundance. The growing body of fish behavior data and the existing footage of fish interacting with gears already carry the vital information for better gears, waiting to be synthesized. Automating data collection and processing not only frees scientists from laborious practices but also redirects their time and focus toward deeper scientific and creative endeavors. User-friendly platforms that translate complex AI algorithms into software tools can encourage even non-practitioners to participate in model training and fish tracking.
As we write this review, powerful cognitive AI models in the field of computer science are advancing at an unparalleled speed. This progress will inevitably spill over into the development of models for fisheries. AI applied in other sectors already exhibits cognitive understanding, giving machines a higher capacity for induction, reasoning, and knowledge acquisition. The evolution of future AI models for automatic recognition of fish-gear interactions now depends on multiple factors:
- First is the careful and accurate classification of fish trajectories that considers 3D components in a moving world (see the sketch after this list).
- Second is the adaptation and re-training of pre-trained models from different human and animal behavioral studies.
- Third is the production of scalable and adaptable models for different case studies in gears and the shareability of fish behavior data among scientists.
- Fourth is the reliance on a continued and harmonious engagement of both marine scientists and AI practitioners to develop cognitive AI for fish-gear interaction systems.
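To illustrate the first factor, the sketch below outlines one plausible trajectory classifier, assuming PyTorch: an LSTM consumes sequences of 3D positions and outputs behavior-class logits. The sequence length, hidden size, and number of classes are hypothetical, and the design is a sketch rather than a method proposed by any study cited here.

```python
# Minimal 3D-trajectory classifier sketch (assumes PyTorch).
# Dimensions and class count are hypothetical.
import torch
import torch.nn as nn

class TrajectoryClassifier(nn.Module):
    def __init__(self, n_classes=4, hidden=64):
        super().__init__()
        # Input: sequences of 3D positions, shape (batch, time, 3).
        self.lstm = nn.LSTM(input_size=3, hidden_size=hidden, batch_first=True)
        self.head = nn.Linear(hidden, n_classes)

    def forward(self, xyz):
        _, (h_n, _) = self.lstm(xyz)   # final hidden state summarizes the track
        return self.head(h_n[-1])      # logits over behavior classes

# Example: 8 tracks of 50 time steps each, each step an (x, y, z) position.
model = TrajectoryClassifier()
logits = model(torch.randn(8, 50, 3))
print(logits.shape)  # torch.Size([8, 4])
```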
There is no magic gear that completely selects targeted species, allows all unwanted species to escape, and incurs no economic or biological losses. However, equipping fishing gears with state-of-the-art technologies may help address ecological problems, reveal the behavior of overlooked species, and make our fishing practices more sustainable, laying the right track as we step into a technological era.
Author contributions
AA, DK, RF conceptualized the content of the review. AA conducted the bibliometric analysis, generated the illustrations and drafted the initial manuscript. DK, RF wrote, commented and reviewed the manuscript. All authors contributed to the article and approved the submitted version.
Funding
This work was done as a part of the Game of Trawls S2 project, funded by the European Maritime and Fisheries Fund and France Filière Pêche (PFEA390021FA1000002). AA’s PhD program is funded by IFREMER (grant DS/2021/10).
Acknowledgments
The authors are thankful to Abdelbadie Belmouhcine for his valuable insights on deep learning methods, to Sonia Méhault, Julien Simon and Pascal Larnaud for the discussion on gear selectivity, to Marie Savina-Rolland for her comments on the manuscript and to Megan Quimbre (IFREMER, Bibliothèque La Pérouse) for the bibliometric analysis. The authors are also grateful to the two reviewers for their time and criticisms which greatly improved the manuscript.
Conflict of interest
The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.
Publisher’s note
All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article, or claim that may be made by its manufacturer, is not guaranteed or endorsed by the publisher.
References
Aguzzi J., Doya C., Tecchio S., de Leo F. C., Azzurro E., Costa C., et al. (2015). Coastal observatories for monitoring of fish behaviour and their responses to environmental changes. Rev. Fish Biol. Fish 25, 463–483. doi: 10.1007/s11160-015-9387-9
Ahmed H., Glasgow J. (2012). Swarm Intelligence: Concepts, Models and Applications. [online] Ontario, Canada: School of Computing Queen’s University Kingston. Available at: https://citeseerx.ist.psu.edu/document?repid=rep1&type=pdf&doi=116b67cf2ad2c948533e6890a9fccc5543dded89.
Alaliyat S., Yndestad H., Sanfilippo F. (2014). Optimisation of boids swarm model based on genetic algorithm and particle swarm optimisation algorithm (Comparative study). Proceedings - 28th European Conference on Modelling and Simulation, ECMS 2014. doi: 10.7148/2014-0643
Albawi S., Mohammed T. A., Al-Zawi S. (2017). “Understanding of a convolutional neural network,” in Proceedings of 2017 International Conference on Engineering and Technology, ICET 2017, Antalya, Turkey, 2018-January. 1–6. doi: 10.1109/ICENGTECHNOL.2017.8308186
Allken V., Handegard N. O., Rosen S., Schreyeck T., Mahiout T., Malde K. (2019). Fish species identification using a convolutional neural network trained on synthetic data. ICES J. Mar. Sci. 76, 342–349. doi: 10.1093/icesjms/fsy147
Allken V., Rosen S., Handegard N. O., Malde K. (2021a). A deep learning-based method to identify and count pelagic and mesopelagic fishes from trawl camera images. ICES J. Mar. Sci. 78, 3780–3792. doi: 10.1093/icesjms/fsab227
Allken V., Rosen S., Handegard N. O., Malde K. (2021b). A real-world dataset and data simulation algorithm for automated fish species identification. Geoscience Data Journal 8, 199–209. doi: 10.1002/gdj3.114
Alshdaifat N. F. F., Talib A. Z., Osman M. A. (2020). Improved deep learning framework for fish segmentation in underwater videos. Ecol. Inform 59, 101121. doi: 10.1016/j.ecoinf.2020.101121
Altshuler D. L., Srinivasan M. V. (2018). Comparison of visually guided flight in insects and birds. Front. Neurosci. 12. doi: 10.3389/fnins.2018.00157
Anantharajah K., Ge Z. Y., McCool C., Denman S., Fookes C., Corke P., et al. (2014). “Local inter-session variability modelling for object classification,” in 2014 IEEE Winter Conference on Applications of Computer Vision, WACV, Steamboat Springs, CO, USA. 309–316. doi: 10.1109/WACV.2014.6836084
Anders N., Fernö A., Humborstad O. B., Løkkeborg S., Rieucau G., Utne-Palm A. C. (2017a). Size-dependent social attraction and repulsion explains the decision of Atlantic cod Gadus morhua to enter baited pots. J. Fish Biol. 91, 1569–1581. doi: 10.1111/JFB.13453
Anders N., Fernö A., Humborstad O. B., Løkkeborg S., Utne-Palm A. C. (2017b). Species specific behaviour and catchability of gadoid fish to floated and bottom set pots. ICES J. Mar. Sci. 74, 769–779. doi: 10.1093/icesjms/fsw200
Arimoto T., Glass C. W., Zhang X. (2010). "Fish vision and its role in fish capture," in Behavior of marine fishes: Capture processes and conservation challenges. Available at: https://books.google.fr/books?hl=en&lr=&id=Rp28-2cAaD8C&oi=fnd&pg=PA25&ots=R4AIAl7dAS&sig=2gJyoWORuHB8iycWs3bu6s_BJug&redir_esc=y#v=onepage&q&f=false (Accessed June 29, 2022).
Aydin C., Tosunoğlu Z. (2010). Selectivity of diamond, square and hexagonal mesh codends for Atlantic horse mackerel Trachurus trachurus, European hake Merluccius merluccius, and greater forkbeard Phycis blennoides in the eastern Mediterranean. J. Appl. Ichthyology 26, 71–77. doi: 10.1111/j.1439-0426.2009.01376.x
Aziz L., Salam H., Bin M., Sheikh U. U., Ayub S. (2020). Exploring deep learning-based architecture, strategies, applications and current trends in generic object detection: A comprehensive review. IEEE Access 8, 170461–170495. doi: 10.1109/ACCESS.2020.3021508
Baatrup E. (2009). Measuring complex behavior patterns in fish - effects of endocrine disruptors on the guppy reproductive behavior. Hum. Ecol. Risk Assess. 15, 53–62. doi: 10.1080/10807030802615097
Banerjee S., Alvey L., Brown P., Yue S., Li L., Scheirer W. J. (2021). An assistive computer vision tool to automatically detect changes in fish behavior in response to ambient odor. Sci. Rep. 11, 547. doi: 10.1038/s41598-020-79772-3
Barreiros M. de O., Dantas D. de O., Silva L. C. de O., Ribeiro S., Barros A. K. (2021). Zebrafish tracking using YOLOv2 and Kalman filter. Sci. Rep. 11, 3219. doi: 10.1038/s41598-021-81997-9
Bekkozhayeva D., Saberioon M., Cisar P. (2021). Automatic individual non-invasive photo-identification of fish (Sumatra barb Puntigrus tetrazona) using visible patterns on a body. Aquaculture Int. 29, 1481–1493. doi: 10.1007/s10499-021-00684-8
Belmouhcine A., Simon J., Courtrai L., Lefevre S. (2021). “Robust deep simple online real-time tracking,” in 2021 12th International Symposium on Image and Signal Processing and Analysis, ISPA, Zagreb, Croatia, 2021-September. 138–144. doi: 10.1109/ISPA52656.2021.9552062
Benson B., Cho J., Goshorn D., Kastner R. (2013) Field programmable gate array (FPGA) based fish detection using haar classifiers. Available at: https://agris.fao.org/agris-search/search.do?recordID=AV2012071748 (Accessed July 7, 2022).
Ben Tamou A., Benzinou A., Nasreddine K. (2021). Multi-stream fish detection in unconstrained underwater videos by the fusion of two convolutional neural network detectors. Appl. Intell. 51, 5809–5821. doi: 10.1007/s10489-020-02155-8
Beyan C., Browman H. I. (2020). Setting the stage for the machine intelligence era in marine science. ICES J. Mar. Sci. 77, 1267–1273. doi: 10.1093/ICESJMS/FSAA084
Bilodeau S. M., Schwartz A. W. H., Xu B., Pauca V. P., Silman M. R. (2022). A low-cost, long-term underwater camera trap network coupled with deep residual learning image analysis. PloS One 17, e0263377. doi: 10.1371/JOURNAL.PONE.0263377
Blaxter J. H. S. (1988). ‘Sensory performance, behavior, and ecology of fish’. in Atema J., et al (eds.) Sensory Biol. Aquat. Anim. Berlin Heidelberg New York: Springer, 203–232. doi: 10.1007/978-1-4612-3714-3_8
Blum A. L., Langley P. (1997). Selection of relevant features and examples in machine learning. Artif. Intell. 97, 245–271. doi: 10.1016/S0004-3702(97)00063-5
Bonofiglio F., de Leo F. C., Yee C., Chatzievangelou D., Aguzzi J., Marini S. (2022). Machine learning applied to big data from marine cabled observatories: A case study of sablefish monitoring in the NE pacific. Front. Mar. Sci. 9. doi: 10.3389/fmars.2022.842946
Boom B. J., He J., Palazzo S., Huang P. X., Beyan C., Chou H.-M., et al. (2014). A research tool for long-term and continuous analysis of fish assemblage in coral-reefs using underwater camera footage. Ecol. Inform 23, 83–97. doi: 10.1016/j.ecoinf.2013.10.006
Bostrom N., Yudkowsky E. (2018). ‘The ethics of artificial intelligence’, in Frankish K., Ramsey W. M. (eds.) The Cambridge Handbook of Artificial Intelligence 1, 316–334. doi: 10.1017/CBO9781139046855.020
Boudhane M., Nsiri B. (2016). Underwater image processing method for fish localization and detection in submarine environment. J. Vis. Commun. Image Represent 39, 226–238. doi: 10.1016/j.jvcir.2016.05.017
Boulais O. E., Woodward B., Schlining B., Lundsten L., Barnard K., Croff Bell K., et al. (2020). FathomNet: An underwater image training database for ocean exploration and discovery. arXiv preprint arXiv:2007.00114. doi: 10.48550/arxiv.2007.00114
Bowmaker J. K., Kunz Y. W. (1987). Ultraviolet receptors, tetrachromatic colour vision and retinal mosaics in the brown trout (Salmo trutta): Age-dependent changes. Vision Res. 27, 2101–2108. doi: 10.1016/0042-6989(87)90124-6
Boyun V. P., Voznenko L. O., Malkush I. F. (2019). Principles of organization of the human eye retina and their use in computer vision systems. Cybernetics Syst. Anal. 55, 701–713. doi: 10.1007/S10559-019-00181-0
Breen M., Dyson J., O’Neill F. G., Jones E., Haigh M. (2004). Swimming endurance of haddock (Melanogrammus aeglefinus l.) at prolonged and sustained swimming speeds, and its role in their capture by towed fishing gears. ICES J. Mar. Sci. 61(7), 1071–1079. doi: 10.1016/j.icesjms.2004.06.014
Brinkhof J., Larsen R. B., Herrmann B., Sistiaga M. (2020). Size selectivity and catch efficiency of bottom trawl with a double sorting grid and diamond mesh codend in the north-east Atlantic gadoid fishery. Fish Res. 231, 105647. doi: 10.1016/j.fishres.2020.105647
Brown C., Laland K., Krause J. (2006). Fish cognition and behavior. Oxford: Blackwell Publishing. doi: 10.1002/9780470996058
Bullough L. W., Napier I. R., Laurenson C. H., Riley D., Fryer R. J., Ferro R. S. T., et al. (2007). A year-long trial of a square mesh panel in a commercial demersal trawl. Fish Res. 83, 105–112. doi: 10.1016/J.FISHRES.2006.09.008
Cachat J. M., Stewart A., Utterback E., Kyzar E., Hart P. C., Carlos D., et al. (2011). Deconstructing adult zebrafish behavior with swim trace visualizations. Neuromethods 51, 191–201. doi: 10.1007/978-1-60761-953-6_16
Cai K., Miao X., Wang W., Pang H., Liu Y., Song J. (2020). A modified YOLOv3 model for fish detection based on MobileNetv1 as backbone. Aquac Eng. 91, 102117. doi: 10.1016/j.aquaeng.2020.102117
Calmon F. P., Wei D., Vinzamuri B., Ramamurthy K. N., Varshney K. R. (2017). arXiv preprint arXiv:1703.02476
Cao S., Zhao D., Sun Y., Ruan C. (2021). Learning-based low-illumination image enhancer for underwater live crab detection. ICES J. Mar. Sci. 78, 979–993. doi: 10.1093/icesjms/fsaa250
Capoccioni F., Leone C., Pulcini D., Cecchetti M., Rossi A., Ciccotti E. (2019). Fish movements and schooling behavior across the tidal channel in a Mediterranean coastal lagoon: An automated approach using acoustic imaging. Fish Res. 219, 105318. doi: 10.1016/j.fishres.2019.105318
Carleton K. L., Escobar-Camacho D., Stieb S. M., Cortesi F., Justin Marshall N. (2020). Seeing the rainbow: Mechanisms underlying spectral sensitivity in teleost fishes. J. Exp. Biol. 223. doi: 10.1242/jeb.193334
Catania K. C., Hare J. F., Campbell K. L. (2008). Water shrews detect movement, shape, and smell to find prey underwater. Proceedings of the National Academy of Sciences 105(2), 571–576. doi: 10.1073/pnas.0709534104
Chandrashekar G., Sahin F. (2014). A survey on feature selection methods. Comput. Electrical Eng. 40, 16–28. doi: 10.1016/J.COMPELECENG.2013.11.024
Chapman C. J., Hawkins A. D. (1973). A field study of hearing in the cod, Gadus morhua L. J. Comp. Physiol. 85, 147–167. doi: 10.1007/BF00696473
Chapman J. W., Nesbit R. L., Burgin L. E., Reynolds D. R., Smith A. D., Middleton D. R., et al. (2010). Flight orientation behaviors promote optimal migration trajectories in high-flying insects. Science 327, 682–685. doi: 10.1126/science.1182990
Cheng S., Zhao K., Zhang D. (2019). Abnormal water quality monitoring based on visual sensing of three-dimensional motion behavior of fish. Symmetry 11, 1179. doi: 10.3390/sym11091179
Chen Q., Zhang C., Zhao J., Ouyang Q. (2013). Recent advances in emerging imaging techniques for non-destructive detection of food quality and safety. TrAC Trends Analytical Chem. 52, 261–274. doi: 10.1016/J.TRAC.2013.09.007
Chidami S., Guénard G., Amyot M. (2007). Underwater infrared video system for behavioral studies in lakes. Limnol Oceanogr Methods 5, 371–378. doi: 10.4319/lom.2007.5.371
Christensen J. H., Mogensen L. V., Galeazzi R., Andersen J. C. (2018). Detection, localization and classification of fish and fish species in poor conditions using convolutional neural networks. IEEE/OES Autonomous Underwater Vehicle Workshop (AUV), Porto, Portugal, 1–6. doi: 10.1109/AUV.2018.8729798
Chua Y., Tan C., Lee Z., Chai T., Seet G., Sluzek A. (2011). Using MTF with fixed-zoning method for automated gated imaging system in turbid medium. Indian J. Mar. Sci. 40, 236–241.
Ciaparrone G., Luque Sánchez F., Tabik S., Troiano L., Tagliaferri R., Herrera F. (2020). Deep learning in video multi-object tracking: A survey. Neurocomputing 381, 61–88. doi: 10.1016/J.NEUCOM.2019.11.023
Connolly R. M., Fairclough D., Jinks E. L., Ditria E. M., Jackson G., Lopez-Marcano S., et al. (2021). Improved accuracy for automated counting of a fish in baited underwater videos for stock assessment. Front. Mar. Sci. 8. doi: 10.3389/fmars.2021.658135
Cooke S. J., Cech J. J., Glassman D. M., Simard J., Louttit S., Lennox R. J., et al. (2020). Water resource development and sturgeon (Acipenseridae): state of the science and research gaps related to fish passage, entrainment, impingement and behavioural guidance. Rev. Fish Biol. Fish 30, 219–244. doi: 10.1007/s11160-020-09596-x
Crescitelli A. M., Gansel L. C., Zhang H. (2021). NorFisk: fish image dataset from Norwegian fish farms for species recognition using deep neural networks. Modeling, Identification and Control: A Norwegian Research Bulletin 42, 1–16. doi: 10.4173/MIC.2021.1.1
Creswell A., White T., Dumoulin V., Arulkumaran K., Sengupta B., Bharath A. A. (2018). Generative adversarial networks: An overview. IEEE Signal Process Mag 35, 53–65. doi: 10.1109/MSP.2017.2765202
Cuende E., Arregi L., Herrmann B., Sistiaga M., Aboitiz X. (2020a). Prediction of square mesh panel and codend size selectivity of blue whiting based on fish morphology. ICES J. Mar. Sci. 77, 2857–2869. doi: 10.1093/icesjms/fsaa156
Cuende E., Arregi L., Herrmann B., Sistiaga M., Onandia I. (2020b). Stimulating release of undersized fish through a square mesh panel in the Basque otter trawl fishery. Fish Res. 224, 105431. doi: 10.1016/J.FISHRES.2019.105431
Cuende E., Herrmann B., Sistiaga M., Basterretxea M., Edridge A., Mackenzie E. K., et al. (2022). Species separation efficiency and effect of artificial lights with a horizonal grid in the Basque bottom trawl fishery. Ocean Coast. Manag 221, 106105. doi: 10.1016/J.OCECOAMAN.2022.106105
Cui S., Zhou Y., Wang Y., Zhai L. (2020). Fish detection using deep learning. Appl. Comput. Intell. Soft Computing 2020, 3738108. doi: 10.1155/2020/3738108
Cunningham P., Cord M., Delany S. J. (2008). "Supervised learning," in Machine Learning Techniques for Multimedia. Berlin Heidelberg, Germany: Springer, 21–49. doi: 10.1007/978-3-540-75171-7_2
Cutter G., Stierhoff K., Zeng J. (2015). "Automated detection of rockfish in unconstrained underwater videos using haar cascades and a new image dataset: Labeled fishes in the wild," in Proceedings - 2015 IEEE Winter Conference on Applications of Computer Vision Workshops, WACVW 2015. 57–62. doi: 10.1109/WACVW.2015.11
Dawkins M., Sherrill L., Fieldhouse K., Hoogs A., Richards B., Zhang D., et al. (2017). “An open-source platform for underwater image & video analytics,” in Proceedings - 2017 IEEE Winter Conference on Applications of Computer Vision (WACV), Santa Rosa, CA, USA 2017, 898–906 (Institute of Electrical and Electronics Engineers Inc). doi: 10.1109/WACV.2017.105
Dealteris J. T., Reifsteck D. M. (1993). Escapement and survival of fish from the codend of a demersal trawl. ICES Mar. Sci. Symp. 196, 128–131.
DeCelles G. R., Keiley E. F., Lowery T. M., Calabrese N. M., Stokesbury K. D. (2017). Development of a video trawl survey system for New England groundfish. Transactions of the American Fisheries Society 146(3), 462–477. doi: 10.1080/00028487.2017.1282888
de Robertis A., Handegard N. O. (2013). Fish avoidance of research vessels and the efficacy of noise-reduced vessels: A review. ICES J. Mar. Sci. 70, 34–45. doi: 10.1093/icesjms/fss155
Dijkgraaf S. (1960). Hearing in bony fishes. Proc. R Soc. Lond B Biol. Sci. 152, 51–54. doi: 10.1098/RSPB.1960.0022
Ditria E. M., Connolly R. M., Jinks E. L., Lopez-Marcano S. (2021a). Annotated video footage for automated identification and counting of fish in unconstrained seagrass habitats. Front. Mar. Sci. 8, 629485. doi: 10.3389/fmars.2021.629485
Ditria E. M., Jinks E. L., Connolly R. M. (2021b). Automating the analysis of fish grazing behaviour from videos using image classification and optical flow. Anim. Behav. 177, 31–37. doi: 10.1016/j.anbehav.2021.04.018
Ditria E. M., Sievers M., Lopez-Marcano S., Jinks E. L., Connolly R. M. (2020). Deep learning for automated analysis of fish abundance: the benefits of training across multiple habitats. Environ. Monit Assess. 192. doi: 10.1007/s10661-020-08653-z
Doksæter L., Handegard N. O., Godø O. R., Kvadsheim P. H., Nordlund N. (2012). Behavior of captive herring exposed to naval sonar transmissions (1.0–1.6 kHz) throughout a yearly cycle. J. Acoust Soc. Am. 131, 1632–1642. doi: 10.1121/1.3675944
Dollár P., Wojek C., Schiele B., Perona P. (2012). Pedestrian detection: An evaluation of the state of the art. IEEE Trans. Pattern Anal. Mach. Intell. 34, 743–761. doi: 10.1109/TPAMI.2011.155
Duecker D. A., Hansen T., Kreuzer E. (2020). “RGB-D camera-based navigation for autonomous underwater inspection using low-cost micro AUVs,” in 2020 IEEE/OES Autonomous Underwater Vehicles Symposium AUV, St. Johns, NL, Canada. doi: 10.1109/AUV50043.2020.9267890
Durden J. M., Schoening T., Althaus F., Friedman A., Garcia R., Glover A. G., et al. (2016). Perspectives in visual imaging for marine biology and ecology: from acquisition to understanding. Oceanography Mar. Biology: Annu. Rev. 54, 315–366. doi: 10.1201/9781315368597
Eickholt J., Kelly D., Bryan J., Miehls S., Zielinski D. (2020). Advancements towards selective barrier passage by automatic species identification: Applications of deep convolutional neural networks on images of dewatered fish. ICES J. Mar. Sci. 77, 2804–2813. doi: 10.1093/icesjms/fsaa150
Ellis W. L., Stadler J. (2005). Application of an in situ infrared camera system for evaluating icthyofaunal utilization of restored and degraded mangrove habitats: developing a set of reference conditions from a NERRS site. Final Report. NOAA/UNH Cooperative Institute for Coastal and Estuarine Environmental Technology (CICEET)
Everingham M., Winn J. (2012). The PASCAL Visual Object Classes Challenge 2012 (VOC2012) Results. http://www.pascalnetwork.org/challenges/VOC/voc2012/workshop/index.html.
Faillettaz R., Picheral M., Luo J. Y., Guigand C., Cowen R. K., Irisson J. O. (2016). Imperfect automatic image classification successfully describes plankton distribution patterns. Methods Oceanography 15–16, 60–77. doi: 10.1016/j.mio.2016.04.003
Feekings J., O’Neill F. G., Krag L., Ulrich C., Veiga Malta T. (2019). An evaluation of European initiatives established to encourage industry-led development of selective fishing gears. Fish Manag Ecol. 26, 650–660. doi: 10.1111/FME.12379
Ferro R. S. T., Jones E. G., Kynoch R. J., Fryer R. J., Buckett B. E. (2007). Separating species using a horizontal panel in the Scottish north Sea whitefish trawl fishery. ICES J. Mar. Sci. 64, 1543–1550. doi: 10.1093/ICESJMS/FSM099
Fier R., Albu A. B., Hoeberechts M. (2014). Automatic fish counting system for noisy deep-sea videos. 2014 Oceans - St. John's, NL, Canada, 1–6. doi: 10.1109/OCEANS.2014.7003118
Fisher R., Chen-Burger Y.-H., Giordano D., Hardman L., Lin F.-P. (Eds.). (2016). Fish4Knowledge: Collecting and analyzing massive coral reef fish video data 104, 319. Berlin Heidelberg, Germany: Springer. doi: 10.1007/978-3-319-30208-9
Fonseca P., Martins R., Campos A., Sobral P. (2005). Gill-net selectivity off the Portuguese western coast. Fish Res. 73, 323–339. doi: 10.1016/j.fishres.2005.01.015
Forbus K. D., Hinrichs T. R. (2006). Companion cognitive systems: A step toward human-level AI. AI Mag 27, 83–83. doi: 10.1609/AIMAG.V27I2.1882
Forlim C. G., Pinto R. D. (2014). Automatic realistic real time stimulation/recording in weakly electric fish: Long time behavior characterization in freely swimming fish and stimuli discrimination. PloS One 9, e84885. doi: 10.1371/journal.pone.0084885
Fouad M. M. M., Zawbaa H. M., El-Bendary N., Hassanien A. E. (2014). “Automatic Nile tilapia fish classification approach using machine learning techniques,” in 13th International Conference on Hybrid Intelligent Systems, HIS 2013, Gammath, Tunisia. 173–178. doi: 10.1109/HIS.2013.6920477
Gan W. S., Yang J., Kamakura T. (2012). A review of parametric acoustic array in air. Appl. Acoustics 73, 1211–1219. doi: 10.1016/J.APACOUST.2012.04.001
Gauen K., Dailey R., Laiman J., Zi Y., Asokan N., Lu Y. H., et al. (2017). “Comparison of visual datasets for machine learning,” in 2017 IEEE International Conference on Information Reuse and Integration (IRI), San Diego, CA, USA. 346–355. doi: 10.1109/IRI.2017.59
Gilpin L. H., Bau D., Yuan B. Z., Bajwa A., Specter M., Kagal L. (2019). “Explaining explanations: An overview of interpretability of machine learning,” in 2018 IEEE 5th International Conference on Data Science and Advanced Analytics (DSAA), Turin, Italy. 80–89. doi: 10.1109/DSAA.2018.00018
Glass C. W., Wardle C. S., Gosden S. J. (1993). Behavioural studies of the principles underlying mesh penetration by fish. ICES Mar. Sci. Symp. 196, 92–97.
Glass C. W., Wardle C. S., Gosden S. J., Racey D. N. (1995). Studies on the use of visual stimuli to control fish escape from codends. i. laboratory studies on the effect of a black tunnel on mesh penetration. Fish Res. 23, 157–164. doi: 10.1016/0165-7836(94)00330-Y
Goldsmith T. H., Fernandez H. R. (1968). Comparative studies of crustacean spectral sensitivity. Z. Vergl. Physiol. 60, 156–175. doi: 10.1007/BF00878449
Goodwin M., Halvorsen K. T., Jiao L., Knausgård K. M., Martin A. H., Moyano M., et al. (2021) Unlocking the potential of deep learning for marine ecology: overview, applications, and outlook. Available at: http://arxiv.org/abs/2109.14737.
Graham N. (2003). By-catch reduction in the brown shrimp, crangon crangon, fisheries using a rigid separation nordmøre grid (grate). Fish Res. 59, 393–407. doi: 10.1016/S0165-7836(02)00015-2
Graham N., Jones E., Reid D. G. (2004). Review of technological advances for the study of fish behaviour in relation to demersal fishing trawls. ICES J. Mar. Sci. 61(7), 1036–1043. doi: 10.1016/j.icesjms.2004.06.006
Grauman K., Leibe B. (2011). Visual object recognition. Synthesis Lectures Artif. Intell. Mach. Learn. 11, 1–180. doi: 10.2200/S00332ED1V01Y201103AIM011
Guidi L., Guerra A. F., Canchaya C., Curry E., Foglini F., Irisson J. O., et al. (2020). “Big data in marine science,“ in Future Science Brief 6 of the European Marine Board, eds Alexander B., Heymans J. J., Muñiz Piniella A., Kellett P., Coopman J. (Ostend: European Marine Board), 1–52. doi: 10.5281/ZENODO.3755793
Guo Z., Zhang L., Jiang Y., Niu W., Gu Z., Zheng H., et al. (2020). "Few-shot fish image generation and classification," in Global Oceans 2020: Singapore - U.S. Gulf Coast. doi: 10.1109/IEEECONF38699.2020.9389005
Gupta U., Kim Y. G., Lee S., Tse J., Lee H. H. S., Wei G. Y., et al. (2022). Chasing carbon: The elusive environmental footprint of computing. IEEE Micro 42, 37–47. doi: 10.1109/MM.2022.3163226
Gupta S., Mukherjee P., Chaudhury S., Lall B., Sanisetty H. (2021). DFTNet: Deep fish tracker with attention mechanism in unconstrained marine environments. IEEE Trans. Instrum Meas 70, 1–13. doi: 10.1109/TIM.2021.3109731
Guthrie D. M. (1986). "Role of vision in fish behaviour," in The Behaviour of Teleost Fishes, 75–113. doi: 10.1007/978-1-4684-8261-4_4
Hannah R. W., Jones S. A. (2012). Evaluating the behavioral impairment of escaping fish can help measure the effectiveness of bycatch reduction devices. Fish Res. 131–133, 39–44. doi: 10.1016/J.FISHRES.2012.07.010
Haro A., Miehls S., Johnson N. S., Wagner C. M. (2020). Evaluation of visible light as a cue for guiding downstream migrant juvenile Sea lamprey. Trans. Am. Fish Soc. 149, 635–647. doi: 10.1002/tafs.10261
Harpaz R., Tkačik G., Schneidman E. (2017). Discrete modes of social information processing predict individual behavior of fish in a group. Proc. Natl. Acad. Sci. U.S.A. 114, 10149–10154. doi: 10.1073/PNAS.1703817114
He P. (1993). Swimming speeds of marine fish in relation to fishing gears. ICES Mar. Sci. Symp. 196, 183–189.
He P. (2010). Behavior of marine fishes: capture processes and conservation challenges. Iowa: Wiley-Blackwell.
He P., Chopin F., Suuronen P., Ferro R., Lansley J. (2021). Classification and illustrated definition of fishing gears. FAO Fisheries and Aquaculture Technical Paper 672, I–94. United Nations Food and Agriculture Organization (FAO), Rome. doi: 10.4060/cb4966en
Hernández-Serna A., Jiménez-Segura L. F. (2014). Automatic identification of species with neural networks. PeerJ. 2, p.e563. doi: 10.7717/peerj.563
Heydarnejad M. S., Fattollahi M., Khoshkam M. (2017). Influence of light colours on growth and stress response of pearl gourami Trichopodus leerii under laboratory conditions. J. Ichthyology 57(6), 908–912. doi: 10.1134/S0032945217060054
Hoffman R. R., Mueller S. T., Klein G., Litman J. (2018). Metrics for explainable AI: Challenges and prospects. arXiv preprint arXiv: 2007.00114. doi: 10.48550/arxiv.1812.04608
Holbrook R. I., Burt de Perera T. (2009). Separate encoding of vertical and horizontal components of space during orientation in fish. Anim. Behav. 78, 241–245. doi: 10.1016/j.anbehav.2009.03.021
Horowitz A. C., Bekoff M. (2015). Naturalizing anthropomorphism: Behavioral prompts to our humanizing of animals. Anthrozoös 20(1), 23–35. doi: 10.2752/089279307780216650
Hossain E., Alam S. M. S., Ali A. A., Amin M. A. (2016). “Fish activity tracking and species identification in underwater video,” in 2016 5th International Conference on Informatics, Electronics and Vision (ICIEV), Dhaka, Bangladesh. 62–66. doi: 10.1109/ICIEV.2016.7760189
Hsiao Y. H., Chen C. C., Lin S. I., Lin F. P. (2014). Real-world underwater fish recognition and identification, using sparse representation. Ecol. Inform 23, 13–21. doi: 10.1016/j.ecoinf.2013.10.002
Huang K., Han Y., Chen K., Pan H., Zhao G., Yi W., et al. (2021). A hierarchical 3D-motion learning framework for animal spontaneous behavior mapping. Nat. Commun. 12, 1–14. doi: 10.1038/s41467-021-22970-y
Huang D., Zhao D., Wei L., Wang Z., Du Y. (2015). Modeling and analysis in marine big data: Advances and challenges. Math Probl Eng. 2015, pp. 1–13. doi: 10.1155/2015/384742
Hu J., Zhao D., Zhang Y., Zhou C., Chen W. (2021). Real-time nondestructive fish behavior detecting in mixed polyculture system using deep-learning and low-cost devices. Expert Syst. Appl. 178, 115051. doi: 10.1016/j.eswa.2021.115051
Iqbal M. A., Wang Z., Ali Z. A., Riaz S. (2021). Automatic fish species classification using deep convolutional neural networks. Wirel Pers. Commun. 116, 1043–1053. doi: 10.1007/s11277-019-06634-1
Jäger J., Simon M., Denzler J., Wolff V., Fricke-Neuderth K., et al. (2015). "Croatian Fish dataset: Fine-grained classification of fish species in their natural habitat," in Amaral T., Matthews S., Plötz T., McKenna S., Fisher R. (eds.), Proceedings of the Machine Vision of Animals and their Behaviour (MVAB), 6.1–6.7. doi: 10.5244/C.29.MVAB.6
Jahanbakht M., Xiang W., Hanzo L., Azghadi M. R. (2021). Internet Of underwater things and big marine data analytics - a comprehensive survey. IEEE Commun. Surveys Tutorials 23, 904–956. doi: 10.1109/COMST.2021.3053118
Jalal A., Salman A., Mian A., Shortis M., Shafait F. (2020). Fish detection and species classification in underwater environments using deep learning with temporal information. Ecol. Inform 57, 101088. doi: 10.1016/j.ecoinf.2020.101088
Joly A., Goëau H., Glotin H., Spampinato C., Bonnet P., Vellinga W.-P., et al. (2016). LifeCLEF 2016: Multimedia life species identification challenges. In Proceedings of the 2016 International Conference of the Cross-Language Evaluation Forum for European Languages (CLEF), Evora, Portugal. 286–310. doi: 10.1007/978-3-319-44564-9_26
Jones M. J., Hale R. (2020). Using knowledge of behaviour and optic physiology to improve fish passage through culverts. Fish Fisheries 21, 557–569. doi: 10.1111/faf.12446
Jones E. G., Summerbell K., O’Neill F. (2008). The influence of towing speed and fish density on the behaviour of haddock in a trawl cod-end. Fish Res. 94, 166–174. doi: 10.1016/j.fishres.2008.06.010
Jordan L. K., Mandelman J. W., McComb D. M., Fordham S., Carlson J. K., Werner T. B. (2013). Linking sensory biology and fisheries bycatch reduction in elasmobranch fishes: a review with new directions for research. Conserv. Physiol. 1, 1–20. doi: 10.1093/CONPHYS/COT002
Kadri S., Metcalfe N. B., Huntingford F. A., Thorpe J. E. (1991). Daily feeding rhythms in Atlantic salmon in sea cages. Aquaculture 92, 219–224. doi: 10.1016/0044-8486(91)90023-Z
Kaimmer S., Stoner A. W. (2008). Field investigation of rare-earth metal as a deterrent to spiny dogfish in the pacific halibut fishery. Fish Res. 94, 43–47. doi: 10.1016/J.FISHRES.2008.06.015
Karlsen J. D., Melli V., Krag L. A. (2021). Exploring new netting material for fishing: The low light level of a luminous netting negatively influences species separation in trawls. ICES J. Mar. Sci. 78, 2818–2829. doi: 10.1093/icesjms/fsab160
Katija K., Orenstein E., Schlining B., Lundsten L., Barnard K., Sainz G., et al. (2021a). FathomNet: A global image database for enabling artificial intelligence in the ocean. arXiv preprint doi: 10.48550/arxiv.2109.14646
Katija K., Roberts P., Daniels J., Lapides A., Barnard K., Risi M., et al. (2021b). Visual tracking of deepwater animals using machine learning-controlled robotic underwater vehicles. In Proceedings of the IEEE/CVF winter conference on applications of computer vision, 860–869.
Kay J., Merrifield M. (2021). The fishnet open images database: A dataset for fish detection and fine-grained categorization in fisheries. arXiv preprint arXiv:2106.09178. doi: 10.48550/arxiv.2106.09178
Kim Y.-H. (2003). Numerical modeling of chaotic behavior for small-scale movements of demersal fishes in coastal water. Fisheries Sci. 69, 535–546. doi: 10.1046/j.0919-9268.2003.00654.x
Kim Y.-H., Wardle C. S. (2003). Optomotor response and erratic response: quantitative analysis of fish reaction to towed fishing gears. Fish Res. 60, 455–470. doi: 10.1016/S0165-7836(02)00114-5
Kim Y. H., Wardle C. S. (2005). Basic modelling of fish behaviour in a towed trawl based on chaos in decision-making. Fish Res. 73, 217–229. doi: 10.1016/j.fishres.2004.12.003
Kim Y. H., Wardle C. S., An Y. S. (2008). Herding and escaping responses of juvenile roundfish to square mesh window in a trawl cod end. Fisheries Sci. 74, 1–7. doi: 10.1111/j.1444-2906.2007.01490.x
Knausgård K. M., Wiklund A., Sørdalen T. K., Halvorsen K. T., Kleiven A. R., Jiao L., et al. (2021). Temperate fish detection and classification: a deep learning based approach. Appl. Intell. 52(6), 6988–7001. doi: 10.1007/s10489-020-02154-9
Krag L. A., Madsen N., Karlsen J. D. (2009). A study of fish behaviour in the extension of a demersal trawl using a multi-compartment separator frame and SIT camera system. Fish Res. 98, 62–66. doi: 10.1016/J.FISHRES.2009.03.012
Krausz B., Bauckhage C. (2011). “Analyzing pedestrian behavior in crowds for automatic detection of congestions,” in 2011 IEEE International Conference on Computer Vision Workshops (ICCV Workshops), Barcelona, Spain. 144–149. doi: 10.1109/ICCVW.2011.6130236
Kunz Y. W. (2006). Review of development and aging in the eye of teleost fish. Neuroembryology Aging 4, 31–60. doi: 10.1159/000103451
Kyllingstad L. T., Reite K.-J., Haugen J., Ladstein J. (2022) SMARTFISH H2020 D5.3: FishData analysis (Open access revision) (SINTEF Ocean). Available at: https://sintef.brage.unit.no/sintef-xmlui/handle/11250/3013186 (Accessed January 11, 2023).
Løkkeborg S. (1990). Rate of release of potential feeding attractants from natural and artificial bait. Fish Res. 8, 253–261. doi: 10.1016/0165-7836(90)90026-R
Laan A., Iglesias-Julios M., de Polavieja G. G. (2018). Zebrafish aggression on the sub-second time scale: evidence for mutual motor coordination and multi-functional attack manoeuvres. R Soc. Open Sci. 5, 180679. doi: 10.1098/RSOS.180679
Langlois T., Goetze J., Bond T., Monk J., Abesamis R. A., Asher J., et al. (2020). A field and video annotation guide for baited remote underwater stereo-video surveys of demersal fish assemblages. Methods Ecol. Evol. 11, 1401–1409. doi: 10.1111/2041-210X.13470
Laradji I., Saleh A., Rodriguez P., Nowrouzezahrai D., Azghadi M. R., Vazquez D. (2020). Affinity LCFCN: Learning to segment fish with weak supervision. arXiv preprint arXiv:2011.03149. doi: 10.48550/arxiv.2011.03149
Larsen R. B., Herrmann B., Sistiaga M., Brinkhof J., Tatone I., Langård L. (2017). Performance of the Nordmøre grid in shrimp trawling and potential effects of guiding funnel length and light stimulation. Mar. Coast. Fisheries. 9(1), 479–492. doi: 10.1080/19425120.2017.1360421
Larsen R. B., Larsen I. (1993). Size selectivity of rigid sorting grids in bottom trawls for Atlantic cod (Gadus morhua) and haddock (Melanogrammus aeglefinus). ICES Mar Sci Symp 196, 178–182.
LeCun Y., Bengio Y., Hinton G. (2015). Deep learning. Nature 521, 436–444. doi: 10.1038/nature14539
Lee D.-J., Schoenberger R. B., Shiozawa D., Xu X., Zhan P. (2004). Contour matching for a fish recognition and migration-monitoring system. Two-and Three-Dimensional Vision Systems for Inspection, Control, and Metrology II, 5606, 37–48. doi: 10.1117/12.571789
Lillywhite K. D., Lee D. J. (2013) Robotic vision lab (Brigham Young University, Fish dataset). Available at: http://roboticvision.groups.et.byu.net/Machine_Vision/BYUFish/BYU_Fish.html (Accessed August 1, 2022).
Liu Y., Wang S., Chen Y. Q. (2016). Automatic 3D tracking system for large swarm of moving objects. Pattern Recognit 52, 384–396. doi: 10.1016/J.PATCOG.2015.11.014
Liu X., Yue Y., Shi M., Qian Z. M. (2019). 3-d video tracking of multiple fish in a water tank. IEEE Access 7, 145049–145059. doi: 10.1109/ACCESS.2019.2945606
Li D., Wang G., Du L., Zheng Y., Wang Z. (2022). Recent advances in intelligent recognition methods for fish stress behavior. Aquac Eng. 96, 102222. doi: 10.1016/J.AQUAENG.2021.102222
Li D., Wang Z., Wu S., Miao Z., Du L., Duan Y. (2020). Automatic recognition methods of fish feeding behavior in aquaculture: A review. Aquaculture 528. doi: 10.1016/j.aquaculture.2020.735508
Li J., Zhu K., Wang F., Jiang F. (2021). Deep neural network-based real time fish detection method in the scene of marine fishing supervision. J. Intelligent Fuzzy Syst. 41, 4527–4532. doi: 10.3233/JIFS-189713
Logares R., Alos J., Catalan I., Solana A. C., Javier del Ocampo F. (2021). “Oceans of big data and artificial intelligence,” Oceans. CSIC scientific challenges towards 2030. 163–179. Available at: https://hal.archives-ouvertes.fr/hal-03372264/.
Lomeli M. J. M., Wakefield W. W. (2019). The effect of artificial illumination on Chinook salmon behavior and their escapement out of a midwater trawl bycatch reduction device. Fish Res. 218, 112–119. doi: 10.1016/j.fishres.2019.04.013
Lomeli M. J. M., Wakefield W. W., Herrmann B., Dykstra C. L., Simeon A., Rudy D. M., et al. (2021). Use of artificial illumination to reduce pacific halibut bycatch in a U.S. West coast groundfish bottom trawl. Fish Res. 233, 105737. doi: 10.1016/j.fishres.2020.105737
Long L., Johnson Z. V., Li J., Lancaster T. J., Aljapur V., Streelman J. T., et al. (2020). Automatic classification of cichlid behaviors using 3D convolutional residual networks. iScience 23, 101591. doi: 10.1016/j.isci.2020.101591
Lopez-Marcano S. ,. L., Jinks E., Buelow C. A., Brown C. J., Wang D., Kusy B., et al. (2021). Automatic detection of fish and tracking of movement for ecology. Ecol. Evol. 11, 8254–8263. doi: 10.1002/ece3.7656
Lucas S., Berggren P. (2022). A systematic review of sensory deterrents for bycatch mitigation of marine megafauna. Rev. Fish Biol. Fisheries 2022, 1–33. doi: 10.1007/S11160-022-09736-5
Lukas J., Romanczuk P., Klenz H., Klamser P., Arias Rodriguez L., Krause J., et al. (2021). Acoustic and visual stimuli combined promote stronger responses to aerial predation in fish. Behav. Ecol. 32, 1094–1102. doi: 10.1093/BEHECO/ARAB043
Lu H., Li Y., Zhang Y., Chen M., Serikawa S., Kim H. (2017). Underwater optical image processing: a comprehensive review. Mobile Networks Appl. 22, 1204–1211. doi: 10.1007/s11036-017-0863-4
Maia C. M., Volpato G. L. (2013). Environmental light color affects the stress response of Nile tilapia. Zoology 116, 64–66. doi: 10.1016/J.ZOOL.2012.08.001
Måløy H., Aamodt A., Misimi E. (2019). A spatio-temporal recurrent network for salmon feeding action recognition from underwater videos in aquaculture. Comput. Electron Agric. 167, 105087. doi: 10.1016/j.compag.2019.105087
Malde K., Handegard N. O., Eikvil L., Salberg A. B. (2020). Machine intelligence and the data-driven future of marine science. ICES J. Mar. Sci. 77, 1274–1285. doi: 10.1093/icesjms/fsz057
Mandralis I., Weber P., Novati G., Koumoutsakos P. (2021). Learning swimming escape patterns for larval fish under energy constraints. Phys. Rev. Fluids 6, 093101. doi: 10.1103/PhysRevFluids.6.093101
Manière G., Coureaud G. (2020). Editorial: From stimulus to behavioral decision-making. Frontiers in Behavioral Neuroscience 13, 274. doi: 10.3389/fnbeh.2019.00274
Marini S., Fanelli E., Sbragaglia V., Azzurro E., del Rio Fernandez J., Aguzzi J. (2018). Tracking fish abundance by underwater image recognition. Sci. Rep. 8, 1–12. doi: 10.1038/s41598-018-32089-8
Kennelly S. J., Broadhurst M. K. (2021). A review of bycatch reduction in demersal fish trawls. Rev. Fish Biol. Fisheries 31, 289–318. doi: 10.1007/s11160-021-09644-0
McClure E. C., Sievers M., Brown C. J., Buelow C. A., Ditria E. M., Hayes M. A., et al. (2020). Artificial intelligence meets citizen science to supercharge ecological monitoring. Patterns. 1 (7), 100109. doi: 10.1016/j.patter.2020.100109
McIntosh D., Marques T. P., Albu A. B., Rountree R., de Leo F. (2020). Movement tracks for the automatic detection of fish behavior in videos. arXiv preprint arXiv:2011.14070. doi: 10.48550/arXiv.2011.14070
Méhault S., Morandeau F., Simon J., Faillettaz R., Abangan A., Cortay A., et al. (2022). Using fish behavior to design a fish pot: Black seabream (Spondyliosoma cantharus) case study. Front. Mar. Sci. 9. doi: 10.3389/FMARS.2022.1009992
Mellody M. (2015). Robust methods for the analysis of images and videos for fisheries stock assessment: Summary of a workshop. Available at: https://nap.nationalacademies.org/catalog/18986/robust-methods-for-the-analysis-of-images-and-videos-for-fisheries-stock-assessment (Accessed June 29, 2022).
Millot S., Bégout M. L., Chatain B. (2009). Exploration behaviour and flight response toward a stimulus in three sea bass strains (Dicentrarchus labrax l.). Appl. Anim. Behav. Sci. 119, 108–114. doi: 10.1016/J.APPLANIM.2009.03.009
Mortensen L. O., Ulrich C., Olesen H. J., Bergsson H., Berg C. W., Tzamouranis N., et al. (2017). Effectiveness of fully documented fisheries to estimate discards in a participatory research scheme. Fish Res. 187, 150–157. doi: 10.1016/J.FISHRES.2016.11.010
Moustahfid H., Michaels W., Alger B., Gangopadhyay A., Brehmer P. (2020). “Advances in fisheries science through emerging observing technologies,” in Global Oceans 2020: Singapore – U.S. Gulf Coast, Biloxi, MS, USA. doi: 10.1109/IEEECONF38699.2020.9389452
Mujtaba D. F., Mahapatra N. R. (2022). Fish species classification with data augmentation. In 2021 International Conference on Computational Science and Computational Intelligence (CSCI). 1588–1593. doi: 10.1109/CSCI54926.2021.00307
Muñoz-Benavent P., Andreu-García G., Valiente-González J. M., Atienza-Vanacloig V., Puig-Pons V., Espinosa V. (2018). Automatic bluefin tuna sizing using a stereoscopic vision system. ICES J. Mar. Sci. 75, 390–401. doi: 10.1093/ICESJMS/FSX151
Murugaiyan J. S., Palaniappan M., Durairaj T., Muthukumar V. (2021). Fish species recognition using transfer learning techniques. Int. J. Adv. Intelligent Inf. 7, 188–197. doi: 10.26555/ijain.v7i2.610
Myrum E., Norstebo S. A., George S., Pedersen M., Museth J. (2019) An automatic image-based system for detecting wild and stocked fish (NIK: Norsk Informatikkonferanse). Available at: https://ntnuopen.ntnu.no/ntnu-xmlui/handle/11250/2639552 (Accessed July 15, 2022).
Nasreddine K., Benzinou A. (2015). “Shape-based fish recognition via shape space,” in 2015 23rd European Signal Processing Conference (EUSIPCO), Nice, France. 145–149. doi: 10.1109/EUSIPCO.2015.7362362
Nawi N. M., Atomi W. H., Rehman M. Z. (2013). The effect of data pre-processing on optimized training of artificial neural networks. Proc. Technol. 11, 32–39. doi: 10.1016/J.PROTCY.2013.12.159
Negahdaripour S. (2005). Calibration of DIDSON forward-scan acoustic video camera. Proc. MTS/IEEE OCEANS 2, 1287–1294. doi: 10.1109/OCEANS.2005.1639932
Nian R., He B., Yu J., Bao Z., Wang Y. (2013). ROV-based underwater vision system for intelligent fish ethology research. Int. J. Adv. Robot Syst. 10, 326. doi: 10.5772/56800
Niu B., Li G., Peng F., Wu J., Zhang L., Li Z. (2018). Survey of fish behavior analysis by computer vision. J. Aquac Res. Dev. 09, 1000534. doi: 10.4172/2155-9546.1000534
Noldus L. P. J. J., Spink A. J., Tegelenbosch R. A. J. (2001). EthoVision: A versatile video tracking system for automation of behavioral experiments. Behav. Res. Methods Instruments Comput. 33, 398–414. doi: 10.3758/BF03195394
O’Connell C. P., Stroud E. M., He P. (2014). The emerging field of electrosensory and semiochemical shark repellents: Mechanisms of detection, overview of past studies, and future directions. Ocean Coast. Manag 97, 2–11. doi: 10.1016/J.OCECOAMAN.2012.11.005
Odling-Smee L., Braithwaite V. A. (2003). The role of learning in fish orientation. Fish Fisheries 4, 235–246. doi: 10.1046/J.1467-2979.2003.00127.X
Okafor E., Schomaker L., Wiering M. A. (2018). An analysis of rotation matrix and colour constancy data augmentation in classifying images of animals. Journal of Information and Telecommunication 2, 465–491. doi: 10.1080/24751839.2018.1479932
Olla B. L., Davis M. W., Rose C. (2000). Differences in orientation and swimming of walleye pollock Theragra chalcogramma in a trawl net under light and dark conditions: concordance between field and laboratory observations. Fish Res. 44, 261–266. doi: 10.1016/S0165-7836(99)00093-4
O’Neill F. G., Feekings J., Fryer R. J., Fauconnet L., Afonso P. (2019). “Discard avoidance by improving fishing gear selectivity: Helping the fishing industry help itself,” in Uhlmann S., Ulrich C., Kennelly S. (eds) The European Landing Obligation. Springer, Cham. 279–296. doi: 10.1007/978-3-030-03308-8_14
O’Neill F. G., Mutch K. (2017). Selectivity in trawl fishing gears. Scottish Mar. Freshw. Sci. 8, 1–85. doi: 10.4789/1890-1
O’Neill F. G., Summerbell K., Barros L. (2018b) ICES WGFTFB 2018 report: Some recent trials with illuminated grids. Available at: https://archimer.ifremer.fr/doc/00586/69766/ (Accessed August 1, 2022).
Ordines F., Massutí E., Guijarro B., Mas R. (2006). Diamond vs. square mesh codend in a multi-species trawl fishery of the western Mediterranean: effects on catch composition, yield, size selectivity and discards. Aquat Living Resour 19, 329–338. doi: 10.1051/ALR:2007003
Oteiza P., Odstrcil I., Lauder G., Portugues R., Engert F. (2017). A novel mechanism for mechanosensory-based rheotaxis in larval zebrafish. Nature 547, 445–448. doi: 10.1038/nature23014
Ovchinnikova K., James M. A., Mendo T., Dawkins M., Crall J., Boswarva K. (2021). Exploring the potential to use low cost imaging and an open source convolutional neural network detector to support stock assessment of the king scallop (Pecten maximus). Ecol. Inform 62, 101233. doi: 10.1016/j.ecoinf.2021.101233
Owen M. A. G., Davies S. J., Sloman K. A. (2010). Light colour influences the behaviour and stress physiology of captive tench (Tinca tinca). Rev Fish Biol Fisheries 20, 375–380. doi: 10.1007/s11160-009-9150-1
Packard J. M., Folse L. J., Stone N. D., Makela M. E., Coulson R. N. (2021). Applications of artificial intelligence to animal behavior, in Bekoff M., Jamieson D. (eds), Interpretation and Explanation in the study of Animal Behavior 2, 147–191. doi: 10.4324/9780429042799-11
Painter K. J. (2021). The impact of rheotaxis and flow on the aggregation of organisms. J. R Soc. Interface 18, 20210582. doi: 10.1098/RSIF.2021.0582
Palazzo S., Murabito F. (2014). “Fish species identification in real-life underwater images,” in MAED 2014 - Proceedings of the 3rd ACM International Regular and Data Challenge Workshop on Multimedia Analysis for Ecological Data (Association for Computing Machinery), New York, NY, USA. 13–18. doi: 10.1145/2661821.2661822
Pang S., del Coz J. J., Yu Z., Luaces O., Díez J. (2017). Deep learning to frame objects for visual target tracking. Eng. Appl. Artif. Intell. 65, 406–420. doi: 10.1016/J.ENGAPPAI.2017.08.010
Pang J., Liu W., Liu B., Tao D., Zhang K., Lu X. (2022). “Interference distillation for underwater fish recognition,” in Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) 13188 LNCS. 62–74. doi: 10.1007/978-3-031-02375-0_5
Papadakis V. M., Papadakis I. E., Lamprianidou F., Glaropoulos A., Kentouri M. (2012). A computer-vision system and methodology for the analysis of fish behavior. Aquac Eng. 46, 53–59. doi: 10.1016/j.aquaeng.2011.11.002
Park Y., Dang L. M., Lee S., Han D., Moon H. (2021). Multiple object tracking in deep learning approaches: A survey. Electronics 10, 2406. doi: 10.3390/ELECTRONICS10192406
Pautsina A., Císař P., Štys D., Terjesen B. F., Espmark Å.M.O. (2015). Infrared reflection system for indoor 3D tracking of fish. Aquac Eng. 69, 7–17. doi: 10.1016/J.AQUAENG.2015.09.002
Pedersen M., Madsen N., Moeslund T. B. (2021). Video data in marine environments. J. Ocean Technol. 16, 21–30.
Pedersen M., Mohammed A. (2021). Photo identification of individual Salmo trutta based on deep learning. Appl. Sci. 11, 9039. doi: 10.3390/app11199039
Pelletier S., Montacir A., Zakari H., Akhloufi M. (2018). “Deep learning for marine resources classification in non-structured scenarios: Training vs. transfer learning,” in 2018 IEEE Canadian Conference on Electrical & Computer Engineering (CCECE), Quebec, QC, Canada. doi: 10.1109/CCECE.2018.8447682
Pereira T. D., Shaevitz J. W., Murthy M. (2020). Quantifying behavior to understand the brain. Nat. Neurosci. 23, 1537–1549. doi: 10.1038/s41593-020-00734-z
Peterson A. N. (2022) The persistent-pursuit and evasion strategies of lionfish and their prey. Available at: https://escholarship.org/uc/item/89n1p8wt (Accessed August 2, 2022).
Pieniazek R. H., Mickle M. F., Higgs D. M. (2020). Comparative analysis of noise effects on wild and captive freshwater fish behaviour. Anim. Behav. 168, 129–135. doi: 10.1016/j.anbehav.2020.08.004
Pietikäinen M., Hadid A., Zhao G., Ahonen T. (2011). Computer vision using local binary patterns. Springer Science & Business Media 40. doi: 10.1007/978-0-85729-748-8
Popoola O. P., Wang K. (2012). Video-based abnormal human behavior recognition - a review. IEEE Transactions on Systems, Man and Cybernetics Part C: Applications and Reviews 42, 865–878. doi: 10.1109/TSMCC.2011.2178594
Popper A. N., Carlson T. J. (1998). Application of sound and other stimuli to control fish behavior. Trans. Am. Fish Soc. 127, 673–707. doi: 10.1577/1548-8659(1998)127<0673:aosaos>2.0.co;2
Popper A. N., Hawkins A. D. (2019). An overview of fish bioacoustics and the impacts of anthropogenic sounds on fishes. J. Fish Biol. 94, 692–713. doi: 10.1111/JFB.13948
Pramunendar R. A., Wibirama S., Santosa P. I. (2019). “Fish classification based on underwater image interpolation and back-propagation neural network,” in 2019 5th International Conference on Science and Technology (ICST), Yogyakarta, Indonesia. 1–6. doi: 10.1109/ICST47872.2019.9166295
Prasetyo E., Suciati N., Fatichah C. (2021). Multi-level residual network VGGNet for fish species classification. J. King Saud Univ. - Comput. Inf. Sci. 34(8), 5286–5295. doi: 10.1016/j.jksuci.2021.05.015
Putland R. L., Mensinger A. F. (2019). Acoustic deterrents to manage fish populations. Rev. Fish Biol. Fish 29, 789–807. doi: 10.1007/s11160-019-09583-x
Pylatiuk C., Zhao H., Gursky E., Reischl M., Peravali R., Foulkes N., et al. (2019). DIY automated feeding and motion recording system for the analysis of fish behavior. SLAS Technol. 24, 394–398. doi: 10.1177/2472630319841412
Qian Z. M., Cheng X. E., Chen Y. Q. (2014). Automatically detect and track multiple fish swimming in shallow water with frequent occlusion. PloS One 9, e106506. doi: 10.1371/JOURNAL.PONE.0106506
Qin H., Li X., Liang J., Peng Y., Zhang C. (2016). DeepFish: Accurate underwater live fish recognition with a deep architecture. Neurocomputing 187, 49–58. doi: 10.1016/j.neucom.2015.10.122
Qiu C., Zhang S., Wang C., Yu Z., Zheng H., Zheng B. (2018). Improving transfer learning and squeeze- and-excitation networks for small-scale fine-grained fish image classification. IEEE Access 6, 78503–78512. doi: 10.1109/ACCESS.2018.2885055
Rasheed J. (2021). “A sustainable deep learning based computationally intelligent seafood monitoring system for fish species screening,” in 2021 International Conference on Artificial Intelligence of Things (ICAIoT), Nicosia, Turkey. 1–6. doi: 10.1109/ICAIOT53762.2021.00008
Rathi D., Jain S., Indu S. (2017). Underwater fish species classification using convolutional neural network and deep learning. Available at: http://dhruvrathi.me/; http://www.dtu.ac.in/Web/Departments/Electronics/faculty/sindu.php.
Ravanbakhsh M., Shortis M. R., Shafait F., Mian A., Harvey E. S., Seager J. W. (2015). Automated fish detection in underwater images using shape-based level sets. Photogrammetric Rec. 30, 46–62. doi: 10.1111/phor.12091
Raymond E. H., Widder E. A. (2007). Behavioral responses of two deep-sea fish species to red, far-red, and white light. Mar. Ecol. Prog. Ser. 350, 291–298. doi: 10.3354/MEPS07196
Raza K., Hong S. (2020). Fast and accurate fish detection design with improved YOLO-v3 model and transfer learning. Int. J. Advanced Comput. Sci. Appl. 11(2), 7–16. doi: 10.14569/ijacsa.2020.0110202
Redmon J., Divvala S., Girshick R., Farhadi A. (2016). “You only look once: Unified, real-time object detection,” in 2016 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Las Vegas, NV, USA. 779–788. doi: 10.1109/CVPR.2016.91
Ribeiro M. T., Singh S., Guestrin C. (2016). "'Why should I trust you?' Explaining the predictions of any classifier," in Proceedings of the ACM SIGKDD International Conference on Knowledge Discovery and Data Mining (Association for Computing Machinery), 1135–1144. doi: 10.1145/2939672.2939778
Rieucau G., de Robertis A., Boswell K. M., Handegard N. O. (2014). School density affects the strength of collective avoidance responses in wild-caught Atlantic herring Clupea harengus: a simulated predator encounter experiment. J. Fish Biol. 85, 1650–1664. doi: 10.1111/jfb.12520
Robbins W. D., Peddemors V. M., Kennelly S. J. (2011). Assessment of permanent magnets and electropositive metals to reduce the line-based capture of Galapagos sharks, Carcharhinus galapagensis. Fish Res. 109, 100–106. doi: 10.1016/J.FISHRES.2011.01.023
Robert M., Cortay A., Morfin M., Simon J., Morandeau F., Deneubourg J. L., et al. (2020). A methodological framework for characterizing fish swimming and escapement behaviors in trawls. PloS One 15, e0243311. doi: 10.1371/journal.pone.0243311
Robinson K. L., Luo J. Y., Sponaugle S., Guigand C., Cowen R. K. (2017). A tale of two crowds: Public engagement in plankton classification. Front. Mar. Sci. 4. doi: 10.3389/FMARS.2017.00082/BIBTEX
Rose C. S., Stoner A. W., Matteson K. (2005). Use of high-frequency imaging sonar to observe fish behaviour near baited fishing gears. Fish Res. 76, 291–304. doi: 10.1016/j.fishres.2005.07.015
Rosen S., Holst J. C. (2013). DeepVision in-trawl imaging: Sampling the water column in four dimensions. Fish Res. 148, 64–73. doi: 10.1016/j.fishres.2013.08.002
Rosen S., Jörgensen T., Hammersland-White D., Holst J. C. (2013). DeepVision: A stereo camera system provides highly accurate counts and lengths of fish passing inside a trawl. Can. J. Fisheries Aquat. Sci. 70, 1456–1467. doi: 10.1139/cjfas-2013-0124
Rudstam L. G., Magnuson J. J., Tonn W. M. (2011). Size selectivity of passive fishing gear: A correction for encounter probability applied to gill nets. Can. J. Fisheries Aquat. Sci. 41, 1252–1255. doi: 10.1139/f84-151
Ruokonen T. J., Keskinen T., Luoma M., Leskelä A., Suuronen P. (2021). The effect of LED lights on trap catches in Finnish inland fisheries. Fish Manag Ecol. 28, 211–218. doi: 10.1111/fme.12482
Ryer C. H., Barnett L. A. K. (2006). Influence of illumination and temperature upon flatfish reactivity and herding behavior: Potential implications for trawl capture efficiency. Fish Res. 81, 242–250. doi: 10.1016/j.fishres.2006.07.001
Ryer C. H., Olla B. L. (2000). Avoidance of an approaching net by juvenile walleye pollock Theragra chalcogramma in the laboratory: The influence of light intensity. Fish Res. 45, 195–199. doi: 10.1016/S0165-7836(99)00113-7
Ryer C. H., Rose C. S., Iseri P. J. (2010). Flatfish herding behavior in response to trawl sweeps: a comparison of diel responses to conventional sweeps and elevated sweeps. Fishery Bull. 108, 145–154.
Saleh A., Laradji I. H., Konovalov D. A., Bradley M., Vazquez D., Sheaves M. (2020). A realistic fish-habitat dataset to evaluate algorithms for underwater visual analysis. Sci Rep. 10, 14671. doi: 10.1038/s41598-020-71639-x
Salman A., Jalal A., Shafait F., Mian A., Shortis M., Seager J., et al. (2016). Fish species classification in unconstrained underwater environments based on deep learning. Limnol Oceanogr Methods 14, 570–585. doi: 10.1002/lom3.10113
Salman A., Maqbool S., Khan A. H., Jalal A., Shafait F. (2019). Real-time fish detection in complex backgrounds using probabilistic background modelling. Ecol. Inform 51, 44–51. doi: 10.1016/j.ecoinf.2019.02.011
Salman A., Siddiqui S. A., Shafait F., Mian A., Shortis M. R., Khurshid K., et al. (2020). Automatic fish detection in underwater videos by a deep neural network-based hybrid motion learning system. ICES J. Mar. Sci. 77, 1295–1307. doi: 10.1093/icesjms/fsz025
Santos J., Herrmann B., Otero P., Fernandez J., Pérez N. (2016). Square mesh panels in demersal trawls: does lateral positioning enhance fish contact probability? Aquat Living Resour 29, 302. doi: 10.1051/alr/2016025
Santos J., Herrmann B., Stepputtis D., Kraak S. B. M., Gökçe G., Mieske B. (2020). Quantifying the performance of selective devices by combining analysis of catch data and fish behaviour observations: Methodology and case study on a flatfish excluder. ICES J. Mar. Sci. 77, 2840–2856. doi: 10.1093/icesjms/fsaa155
Sarriá D., del Río J., Lázaro A. M., Aguzzi J., et al. (2009). “Studying the behaviour of Norway lobster using RFID and infrared tracking technologies,” in OCEANS 2009 - EUROPE, Bremen, Germany. 1–4. doi: 10.1109/OCEANSE.2009.5278280
Sawada K., Takahashi H., Takao Y., Watanabe K., Horne J. K., McClatchie S., et al. (2004). “Development of an acoustic-optical system to estimate target-strengths and tilt angles from fish aggregations,” in Oceans '04 - MTS/IEEE Techno-Ocean '04: Bridges across the Oceans - Conference Proceedings, Kobe, Japan. 1, 395–400. doi: 10.1109/OCEANS.2004.1402949
Schaerf T. M., Dillingham P. W., Ward A. J. W. (2017). The effects of external cues on individual and collective behavior of shoaling fish. Sci. Adv. 3, e1603201. doi: 10.1126/sciadv.1603201
Schwarz A. L. (1985). The behavior of fishes in their acoustic environment. Environ. Biol. Fish. 13, 3–15. doi: 10.1007/BF00004851
Schwarz A. L., Greer G. L. (2011). Responses of Pacific herring, Clupea harengus pallasi, to some underwater sounds. Can. J. Fisheries Aquat. Sci. 41, 1183–1192. doi: 10.1139/f84-140
Sengupta E., Garg D., Choudhury T., Aggarwal A. (2018). “Techniques to eliminate human bias in machine learning,” in 2018 International Conference on System Modeling & Advancement in Research Trends (SMART), Moradabad, India. 226–230. doi: 10.1109/SYSMART.2018.8746946
Shafait F., Mian A., Shortis M., Ghanem B., Culverhouse P. F., Edgington D., et al. (2016). Fish identification from videos captured in uncontrolled underwater environments. ICES J. Mar. Sci. 73, 2737–2746. doi: 10.1093/icesjms/fsw106
Shah S. Z. H., Rauf H. T., Lali I., Bukhari S. A. C., Khalid M. S., Farooq M., et al. (2019). Fish-pak: Fish species dataset from Pakistan for visual features based classification. Data in Brief 27, 104565. doi: 10.17632/n3ydw29sbz.3
Sharber N. G., Carothers S. W., Sharber J. P., de Vos J. C. Jr., House D. A. (1994). Reducing electrofishing-induced injury of rainbow trout. N. Am. J. Fish. Manag. 14, 340–346. doi: 10.1577/1548-8675(1994)014<0340:REIIOR>2.3.CO;2
Shaw R. F. (2004). Arithmetic operations in a binary computer. Rev. Sci. Instruments 21, 687. doi: 10.1063/1.1745692
Shcherbakov D., Knörzer A., Hilbig R., Haas U., Blum M. (2012). Near-infrared orientation of Mozambique tilapia Oreochromis mossambicus. Zoology 115, 233–238. doi: 10.1016/j.zool.2012.01.005
Siddiqui S. A., Salman A., Malik M. I., Shafait F., Mian A., Shortis M. R., et al. (2018). Automatic fish species classification in underwater videos: Exploiting pre-trained deep neural network models to compensate for limited labelled data. ICES J. Mar. Sci. 75, 374–389. doi: 10.1093/icesjms/fsx109
Simon J., Kopp D., Larnaud P., Vacherot J. P., Morandeau F., Lavialle G., et al. (2020). Using automated video analysis to study fish escapement through escape panels in active fishing gears: Application to the effect of net colour. Mar. Policy 116, 103785. doi: 10.1016/j.marpol.2019.103785
Simons T., Lee D. J. (2021). Efficient binarized convolutional layers for visual inspection applications on resource-limited FPGAs and ASICs. Electronics 10, 1511. doi: 10.3390/electronics10131511
Sinhuber M., van der Vaart K., Ni R., Puckett J. G., Kelley D. H., Ouellette N. T. (2019). Three-dimensional time-resolved trajectories from laboratory insect swarms. Sci. Data 6, 1–8. doi: 10.1038/sdata.2019.36
Skinner B. F. (2010). The generic nature of the concepts of stimulus and response. The Journal of General Psychology 12, 40–65. doi: 10.1080/00221309.1935.9920087
Sokolova M., Mompó Alepuz A., Thompson F., Mariani P., Galeazzi R., Krag L. A. (2021). A deep learning approach to assist sustainability of demersal trawling operations. Sustainability 13, 12362. doi: 10.3390/su132212362
Southworth L. K., Ratcliffe F. C., Bloor I. S. M., Emmerson J., Watson D., Beard D., et al. (2020). Artificial light improves escapement of fish from a trawl net. J. Mar. Biol. Assoc. United Kingdom 100, 267–275. doi: 10.1017/S0025315420000028
Spampinato C., Giordano D., di Salvo R., Chen-Burger Y. H., Fisher R. B., Nadarajan G. (2010). “Automatic fish classification for underwater species behavior understanding,” in ARTEMIS’10 - Proceedings of the 1st ACM Workshop on Analysis and Retrieval of Tracked Events and Motion in Imagery Streams, Firenze, Italy. 45–50. doi: 10.1145/1877868.1877881
Spampinato C., Palazzo S., Boom B., van Ossenbruggen J., Kavasidis I., di Salvo R., et al. (2014). Understanding fish behavior during typhoon events in real-life underwater environments. Multimed Tools Appl. 70, 199–236. doi: 10.1007/s11042-012-1101-5
Spangler G., Collins J. (2011). Lake Huron fish community structure based on gill-net catches corrected for selectivity and encounter probability. N. Am. J. Fish. Manag. 12 (3), 585–597.
Stewart P. A. M. (2001). A review of studies of fishing gear selectivity in the Mediterranean. FAO COPEMED Report No. 9, Aberdeen, UK. 57 pp.
Stienessen S. C., Parrish J. K. (2013). The effect of disparate information on individual fish movements and emergent group behavior. Behav. Ecol. 24, 1150–1160. doi: 10.1093/beheco/art042
Stuart I. G., Zampatti B. P., Baumgartner L. J. (2008). Can a low-gradient vertical-slot fishway provide passage for a lowland river fish community? Mar. Freshw. Res. 59, 332–346. doi: 10.1071/MF07141
Sung M., Yu S. C., Girdhar Y. (2017). “Vision based real-time fish detection using convolutional neural network,” in OCEANS 2017 - Aberdeen, Aberdeen, UK. 1–6. doi: 10.1109/OCEANSE.2017.8084889
Torres A., Abril A. M., Clua E. E. G. (2020). A time-extended (24 h) baited remote underwater video (BRUV) for monitoring pelagic and nocturnal marine species. J. Mar. Sci. Eng. 8, 208. doi: 10.3390/jmse8030208
Underwood M. J., Utne Palm A. C., Øvredal J. T., Bjordal Å. (2021). The response of mesopelagic organisms to artificial lights. Aquaculture and Fisheries 6, 519–529. doi: 10.1016/j.aaf.2020.05.002
Valletta J. J., Torney C., Kings M., Thornton A., Madden J. (2017). Applications of machine learning in animal behaviour studies. Anim. Behav. 124, 203–220. doi: 10.1016/j.anbehav.2016.12.005
van Gerven M., Bohte S. (2017). Editorial: Artificial neural networks as models of neural information processing. Front. Comput. Neurosci. 11. doi: 10.3389/fncom.2017.00114
Vaswani A., Shazeer N., Parmar N., Uszkoreit J., Jones L., et al. (2017). “Attention is all you need,” in Advances in Neural Information Processing Systems. 5998–6008.
Villon S., Iovan C., Mangeas M., Claverie T., Mouillot D., Villéger S., et al. (2021). Automatic underwater fish species classification with limited data using few-shot learning. Ecol. Inform 63, 101320. doi: 10.1016/j.ecoinf.2021.101320
Borges P. V. K., Conci N., Cavallaro A. (2013). Video-based human behavior understanding: A survey. IEEE Trans. Circuits Syst. Video Technol. 23, 1993–2008. doi: 10.1109/TCSVT.2013.2270402
Viscido S. V., Parrish J. K., Grünbaum D. (2004). Individual behavior and emergent properties of fish schools: a comparison of observation and theory. Mar. Ecol. Prog. Ser. 273, 239–249. doi: 10.3354/meps273239
Vogel C. (2016). Sélectivité des engins de pêche [Fishing gear selectivity]. Available at: https://archimer.ifremer.fr/doc/00317/42869/ (Accessed June 29, 2022).
Walsh S. J., Engås A., Ferro R., Fonteyne R., van Marlen B. (2002). To catch or conserve more fish: the evolution of fishing technology in fisheries science. ICES Marine Science Symposia Report. doi: 10.17895/ices.pub.8887
Walsh S. J., Godø O. R., Michalsen K. (2004). Fish behaviour relevant to fish catchability. ICES J. Mar. Sci. 61, 1238–1239. doi: 10.1016/J.ICESJMS.2004.08.004
Wang J. H., Lee S. K., Lai Y. C., Lin C. C., Wang T. Y., Lin Y. R., et al. (2020). Anomalous behaviors detection for underwater fish using AI techniques. IEEE Access 8, 224372–224382. doi: 10.1109/ACCESS.2020.3043712
Wang G., Muhammad A., Liu C., Du L., Li D. (2021). Automatic recognition of fish behavior with a fusion of RGB and optical flow data based on deep learning. Animals 11, 2774. doi: 10.3390/ani11102774
Wang X., Ouyang J., Li D., Zhang G. (2019). Underwater object recognition based on deep encoding-decoding network. J. Ocean Univ. China 18, 376–382. doi: 10.1007/s11802-019-3858-x
Watson J. (2013). “Subsea imaging and vision: An introduction,” in Watson J., Zielinski O. (eds.), Subsea optics and imaging (Amsterdam, Netherlands: Elsevier), 17–34. doi: 10.1533/9780857093523.1.17
Watson J. W., Kerstetter D. W. (2006). Pelagic longline fishing gear: A brief history and review of research efforts to improve selectivity. Mar. Technol. Soc. J. 40, 6. doi: 10.4031/002533206787353259
Wei Y., Duan Y., An D. (2022). Monitoring fish using imaging sonar: Capacity, challenges and future perspective. Fish and Fisheries 23(6), 1347–1370.
Weissburg M. J. (2016). The fluid dynamical context of chemosensory behavior. Biol. Bull. 198, 188–202. doi: 10.2307/1542523
Widder E. A., Robison B. H., Reisenbichler K. R., Haddock S. H. D. (2005). Using red light for in situ observations of deep-sea fishes. Deep Sea Res. I Oceanogr. Res. Pap. 52, 2077–2085. doi: 10.1016/j.dsr.2005.06.007
Williams K., Lauffenburger N., Chuang M.-C., Hwang J.-N., Towler R. (2016). Automated measurements of fish within a trawl using stereo images from a camera-trawl device (CamTrawl). Methods Oceanography 17, 138–152. doi: 10.1016/j.mio.2016.09.008
Wu Z., Hristov N. I., Kunz T. H., Betke M. (2009). “Tracking-reconstruction or reconstruction-tracking? Comparison of two multiple hypothesis tracking approaches to interpret 3D object motion from several camera views,” in 2009 Workshop on Motion and Video Computing, WMVC '09. doi: 10.1109/WMVC.2009.5399245
Xia M., Chen X., Lang H., Shao H., Williams D., Gazivoda M., et al. (2022). Features and always-on wake-up detectors for sparse acoustic event detection. Electronics 11, 478. doi: 10.3390/electronics11030478
Xu J., Sang W., Dai H., Lin C., Ke S., Mao J., et al. (2022). A detailed analysis of the effect of different environmental factors on fish phototactic behavior: Directional fish guiding and expelling technique. Animals 12, 240. doi: 10.3390/ani12030240
Xu W., Matzner S. (2018). “Underwater fish detection using deep learning for water power applications,” in Proceedings - 2018 International Conference on Computational Science and Computational Intelligence, CSCI 2018, Las Vegas, NV, USA. 313–318. doi: 10.1109/CSCI46756.2018.00067
Xu Y., Liu X., Cao X., Huang C., Liu E., Qian S., et al. (2021). Artificial intelligence: A powerful paradigm for scientific research. Innovation 2, 100179. doi: 10.1016/j.xinn.2021.100179
Yan H. Y., Anraku K., Babaran R. P. (2010). “Hearing in marine fish and its application in fisheries,” in Behavior of Marine Fishes: Capture Processes and Conservation Challenges. 45–64. doi: 10.1002/9780813810966.ch3
Yan Z., Bi Y., Xue B., Zhang M. (2021). “Automatically extracting features using genetic programming for low-quality fish image classification,” in 2021 IEEE Congress on Evolutionary Computation, CEC 2021 - Proceedings, Kraków, Poland. 2015–2022. doi: 10.1109/CEC45853.2021.9504737
Yang L., Liu Y., Yu H., Fang X., Song L., Li D., et al. (2020). Computer vision models in intelligent aquaculture with emphasis on fish detection and behavior analysis: A review. Arch. Comput. Methods Eng. 28, 2785–2816. doi: 10.1007/s11831-020-09486-2
Yang X., Zhang S., Liu J., Gao Q., Dong S., Zhou C. (2021b). Deep learning for smart fish farming: applications, opportunities and challenges. Rev. Aquac 13 (1), 66–90. doi: 10.1111/raq.12464
Yochum N., Stone M., Breddermann K., Berejikian B. A., Gauvin J. R., Irvine D. J. (2021). Evaluating the role of bycatch reduction device design and fish behavior on Pacific salmon (Oncorhynchus spp.) escapement rates from a pelagic trawl. Fish Res. 236, 105830. doi: 10.1016/j.fishres.2020.105830
York R. A., Patil C., Darrin Hulsey C., Anoruo O., Streelman J. T., Fernald R. D. (2015). Evolution of bower building in Lake Malawi cichlid fish: Phylogeny, morphology, and behavior. Front. Ecol. Evol. 3. doi: 10.3389/fevo.2015.00018
Yu X., Wang Y., An D., Wei Y. (2021). Identification methodology of special behaviors for fish school based on spatial behavior characteristics. Comput. Electron Agric. 185, 106169. doi: 10.1016/j.compag.2021.106169
Yuan H., Zhang S., Chen G., Yang Y. (2020). Underwater image fish recognition technology based on transfer learning and image enhancement. J. Coast. Res. 105, 124–128. doi: 10.2112/JCR-SI105-026.1
Zhang L., Li W., Liu C., Zhou X., Duan Q. (2020). Automatic fish counting method using image density grading and local regression. Comput. Electron Agric. 179, 105844. doi: 10.1016/j.compag.2020.105844
Zhao J., Bao W., Zhang F., Zhu S., Liu Y., Lu H., et al. (2018). Modified motion influence map and recurrent neural network-based monitoring of the local unusual behaviors for fish school in intensive aquaculture. Aquaculture 493, 165–175. doi: 10.1016/j.aquaculture.2018.04.064
Zhou C., Xu D., Chen L., Zhang S., Sun C., Yang X., et al. (2019). Evaluation of fish feeding intensity in aquaculture using a convolutional neural network and machine vision. Aquaculture 507, 457–465. doi: 10.1016/j.aquaculture.2019.04.056
Zhou C., Zhang B., Lin K., Xu D., Chen C., Yang X., et al. (2017). Near-infrared imaging to quantify the feeding behavior of fish in aquaculture. Comput. Electron Agric. 135, 233–241. doi: 10.1016/j.compag.2017.02.013
Keywords: fisheries, gear technology, underwater observation systems, deep learning, fish behavior tracking
Citation: Abangan AS, Kopp D and Faillettaz R (2023) Artificial intelligence for fish behavior recognition may unlock fishing gear selectivity. Front. Mar. Sci. 10:1010761. doi: 10.3389/fmars.2023.1010761
Received: 03 August 2022; Accepted: 31 January 2023;
Published: 23 February 2023.
Edited by:
Lyne Morissette, M–Expertise Marine, Canada
Reviewed by:
Abdullah-Al Arif, Yokohama City University, Japan
Hongsheng Bi, College Park, United States
Copyright © 2023 Abangan, Kopp and Faillettaz. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.
*Correspondence: Robin Faillettaz, robin.faillettaz@ifremer.fr