AUTHOR=Zhu Qingpeng, Triesch Jochen, Shi Bertram E. TITLE=Joint Learning of Binocularly Driven Saccades and Vergence by Active Efficient Coding JOURNAL=Frontiers in Neurorobotics VOLUME=11 YEAR=2017 URL=https://www.frontiersin.org/journals/neurorobotics/articles/10.3389/fnbot.2017.00058 DOI=10.3389/fnbot.2017.00058 ISSN=1662-5218 ABSTRACT=

This paper investigates two types of eye movements: vergence and saccades. Vergence eye movements are responsible for bringing the images of the two eyes into correspondence, whereas saccades drive gaze to interesting regions in the scene. Control of both vergence and saccades develops during early infancy. To date, these two types of eye movements have been studied separately. Here, we propose a computational model of an active vision system that integrates these two types of eye movements. We hypothesize that incorporating a saccade strategy driven by bottom-up attention will benefit the development of vergence control. The integrated system is based on the active efficient coding framework, which describes how sensory processing and eye movement control develop jointly to optimize the coding efficiency of the sensory system. In the integrated system, we propose a binocular saliency model to drive saccades based on learned binocular feature extractors, which simultaneously encode both depth and texture information. Saliency in our model also depends on the current fixation point. This extends prior work, which focused on monocular images and saliency measures that are independent of the current fixation. Our results show that the proposed saliency-driven saccades lead to better vergence performance and faster learning in the overall system than random saccades. Faster learning is significant because it indicates that the system actively selects inputs for the most effective learning. This work suggests that saliency-driven saccades provide a scaffold for the development of vergence control during infancy.
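
The sketch below illustrates, in simplified form, the two ideas the abstract describes: coding efficiency of binocular input as a reward signal for vergence, and a fixation-dependent saliency map for selecting saccade targets. It is a minimal conceptual sketch, not the authors' implementation; the dictionary, the top-k sparse code, the reward, and the Gaussian proximity weighting are illustrative stand-ins, and all parameter values are assumptions.

```python
# Conceptual sketch of active efficient coding for vergence and a
# fixation-dependent binocular saliency map (NumPy only; illustrative values).
import numpy as np

rng = np.random.default_rng(0)

PATCH = 8                  # patch side length in pixels (assumed)
N_BASES = 64               # number of binocular basis functions (assumed)
DIM = 2 * PATCH * PATCH    # left + right patch, concatenated

# Random unit-norm binocular dictionary; in the paper this is learned by
# sparse coding rather than fixed at random.
D = rng.standard_normal((DIM, N_BASES))
D /= np.linalg.norm(D, axis=0, keepdims=True)

def encode(patch_lr, k=6):
    """Crude sparse code: keep only the k largest dictionary responses."""
    resp = D.T @ patch_lr
    code = np.zeros_like(resp)
    idx = np.argsort(np.abs(resp))[-k:]
    code[idx] = resp[idx]
    return code

def coding_error(patch_lr, code):
    """Reconstruction error; low error means the binocular input is coded efficiently."""
    return np.sum((patch_lr - D @ code) ** 2)

def vergence_reward(patch_lr):
    """Active efficient coding idea: reward vergence commands that make the
    binocular input easy to encode (i.e. minimize reconstruction error)."""
    return -coding_error(patch_lr, encode(patch_lr))

def saliency_map(feature_energy, fixation, sigma=20.0):
    """Fixation-dependent saliency: binocular feature energy at each location,
    discounted by distance from the current fixation point (a simple proximity
    weighting standing in for the paper's fixation-dependent measure)."""
    h, w = feature_energy.shape
    ys, xs = np.mgrid[0:h, 0:w]
    dist2 = (ys - fixation[0]) ** 2 + (xs - fixation[1]) ** 2
    return feature_energy * np.exp(-dist2 / (2 * sigma ** 2))

# Toy usage: a random binocular patch and a random feature-energy map.
patch = rng.standard_normal(DIM)
print("vergence reward:", vergence_reward(patch))

energy = rng.random((64, 64))
sal = saliency_map(energy, fixation=(32, 32))
print("saccade target:", np.unravel_index(np.argmax(sal), sal.shape))
```

In this toy loop, the same learned binocular features would serve both roles: their reconstruction quality rewards vergence adjustments, while their response energy, weighted by proximity to the current fixation, selects the next saccade target.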