
ORIGINAL RESEARCH article
Front. Comput. Sci.
Sec. Computer Vision
Volume 6 - 2024 | doi: 10.3389/fcomp.2024.1369887
The final, formatted version of the article will be published soon.
Conventional object detection algorithms are limited by the availability of training and testing samples in the target domain. To reduce dependence on the target domain and the burden of annotation while preserving detection performance, this paper proposes IF-Net (Information Invariant Scaling and Feature Alignment Module Net), a framework for extracting feature information from unconstrained objects. First, an Information Invariant Scaling (IIS) module is proposed, which injects multiple layers of information at different depths of the network and substantially improves detection performance. Second, a Feature Alignment Module (FAM) is introduced, which trains target boxes on source-domain samples and aligns features across domains, enhancing the network's performance while reducing its reliance on large numbers of target-domain samples. Finally, a novel loss function, ℒ_IF, is proposed to further improve detection performance and accuracy. Experiments on three publicly available datasets demonstrate significant advantages of the framework over state-of-the-art algorithms.
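The abstract does not specify how the FAM aligns source- and target-domain features. As a purely illustrative sketch (not the paper's implementation), one common way to align feature distributions across domains is per-channel moment matching, where target features are rescaled to share the source features' mean and standard deviation; the function name `align_features` and all parameters here are hypothetical:

```python
import numpy as np

def align_features(source, target, eps=1e-6):
    """Rescale target features to match per-channel source statistics.

    Illustrative moment matching only; the paper's FAM may differ.
    source, target: arrays of shape (num_samples, num_channels).
    """
    s_mu, s_sigma = source.mean(axis=0), source.std(axis=0)
    t_mu, t_sigma = target.mean(axis=0), target.std(axis=0)
    # Whiten target features, then re-color them with source statistics.
    return (target - t_mu) / (t_sigma + eps) * s_sigma + s_mu

# Toy demonstration: target features drawn from a shifted, wider distribution.
rng = np.random.default_rng(0)
source = rng.normal(0.0, 1.0, size=(256, 64))
target = rng.normal(2.0, 3.0, size=(256, 64))
aligned = align_features(source, target)
```

After alignment, the per-channel statistics of `aligned` closely track those of `source`, which is the intuition behind reducing the gap between domains before detection heads consume the features.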
Keywords: image processing, object detection, IIS Module, FAM Module, SOTA algorithm
Received: 15 Feb 2024; Accepted: 13 Nov 2024.
Copyright: © 2024 Ding, Sun, Li and Sun. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) or licensor are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.
* Correspondence:
Gang Sun, Fuyang Normal University, Fuyang, China
Disclaimer: All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article or claim that may be made by its manufacturer is not guaranteed or endorsed by the publisher.