AUTHOR=Liu Wei-Liang, Wang Yuhling, Chen Yu-Xuan, Chen Bo-Yu, Lin Arvin Yi-Chu, Dai Sheng-Tong, Chen Chun-Hong, Liao Lun-De
TITLE=An IoT-based smart mosquito trap system embedded with real-time mosquito image processing by neural networks for mosquito surveillance
JOURNAL=Frontiers in Bioengineering and Biotechnology
VOLUME=11
YEAR=2023
URL=https://www.frontiersin.org/journals/bioengineering-and-biotechnology/articles/10.3389/fbioe.2023.1100968
DOI=10.3389/fbioe.2023.1100968
ISSN=2296-4185
ABSTRACT=
An essential aspect of controlling and preventing mosquito-borne diseases is reducing the number of mosquitoes that carry viruses. We designed a smart mosquito trap system to reduce the density of mosquito vectors and the spread of mosquito-borne diseases. This smart trap uses computer vision technology and deep learning networks to identify features of live Aedes aegypti and Culex quinquefasciatus in real time. A unique mechanical design based on a rotation concept is also proposed and implemented to successfully capture specific living mosquitoes in the corresponding chambers. Moreover, the system is equipped with sensors to detect environmental data, such as CO2 concentration, temperature, and humidity. We successfully demonstrated the implementation of such a tool and paired it with a reliable mechanism for capturing live mosquitoes without destroying important morphological features. The neural network achieved 91.57% accuracy on test set images. When the trap prototype was deployed in a tent, the accuracy in distinguishing live Ae. aegypti was 92%, with a capture rate reaching 44%. When the prototype was integrated into a BG trap to produce a smart mosquito trap and placed in the tent, it achieved a 97% recognition rate and a 67% catch rate. In a simulated living room, the recognition and capture rates were 90% and 49%, respectively. This smart trap correctly differentiated between Cx. quinquefasciatus and Ae. aegypti mosquitoes, and may also help control mosquito-borne diseases and predict possible outbreaks.