Event Abstract

Mapping the Navigational Information Content of Insect Habitats

  • 1 German Aerospace Center (DLR), Germany
  • 2 Australian National University, Australia

To develop and validate models of insect navigation, it is essential to identify the visual input that insects experience in their natural habitats. Here we report on the development of methods for reconstructing what insects see when making navigational decisions, and we critically assess the current limitations of these methods.
We used a laser range finder as well as camera-based methods to capture the 3D structure and appearance of outdoor environments. Both approaches produce coloured point clouds that allow, within the resolution of the model, the reconstruction of views at defined positions and orientations. For instance, we filmed bees and wasps with a high-speed stereo camera system to estimate their 3D flight paths and gaze directions. The high-speed system is registered with a 3D model of the same environment, such that panoramic images can be rendered along the insects’ flight paths (see the accompanying abstract “Benchmark 3D-models of natural navigation environments @ www.InsectVision.org” by Mair et al.).
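The core rendering step can be sketched as follows: given a coloured point cloud, a viewpoint, and a heading, each point is projected into an equirectangular (panoramic) image by its azimuth and elevation relative to the viewpoint. This is a minimal illustrative sketch, not the authors' actual rendering pipeline; the function name, the painter's-algorithm depth handling, and the single-axis (yaw-only) orientation are our own simplifying assumptions.

```python
import numpy as np

def render_panorama(points, colors, position, yaw=0.0, width=360, height=180):
    """Project a coloured point cloud into an equirectangular panorama seen
    from `position` with heading `yaw` (radians, about the vertical z-axis).
    Nearer points overwrite farther ones (depth sort, painter's algorithm).
    Illustrative sketch only -- real renderers splat points over several
    pixels and handle occlusion more carefully."""
    rel = points - position                         # points relative to viewpoint
    c, s = np.cos(-yaw), np.sin(-yaw)               # rotate world into camera frame
    x = c * rel[:, 0] - s * rel[:, 1]
    y = s * rel[:, 0] + c * rel[:, 1]
    z = rel[:, 2]
    dist = np.linalg.norm(rel, axis=1)
    azimuth = np.arctan2(y, x)                      # [-pi, pi]
    elevation = np.arcsin(np.clip(z / np.maximum(dist, 1e-9), -1.0, 1.0))
    # map viewing angles to pixel coordinates
    u = ((azimuth + np.pi) / (2 * np.pi) * width).astype(int) % width
    v = ((np.pi / 2 - elevation) / np.pi * height).astype(int).clip(0, height - 1)
    image = np.zeros((height, width, 3), dtype=colors.dtype)
    order = np.argsort(-dist)                       # far to near: near points painted last
    image[v[order], u[order]] = colors[order]
    return image
```

A single point one metre ahead of the camera (along +x, at eye level) lands at the centre of the panorama, i.e. at azimuth 0 and elevation 0.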
The laser range finder (see figure A) is equipped with a rotating camera that provides colour information for the measured 3D points. The system is robust and easy to use in the field, generating high-resolution data (about 50 × 10⁶ points) with a large field of view, up to a distance of 80 m, at typical acquisition times of about 8 minutes. However, a large number of scans at different locations have to be recorded and registered to account for occlusions.
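Registering scans taken at different locations amounts to estimating a rigid transform (rotation plus translation) that aligns one point cloud with another. As a hedged sketch of the underlying geometry, assuming point correspondences between two scans are already known, the optimal rigid transform can be recovered in closed form with the Kabsch algorithm; practical scan registration (e.g. ICP) iterates this step while re-estimating correspondences, and the function below is our own illustration, not the authors' registration software.

```python
import numpy as np

def align_scans(src, dst):
    """Rigid transform (R, t) mapping `src` points onto `dst`, assuming
    known point correspondences (Kabsch algorithm).  Returns R (3x3
    rotation) and t (3-vector) such that dst ~= src @ R.T + t."""
    cs, cd = src.mean(axis=0), dst.mean(axis=0)     # centroids
    H = (src - cs).T @ (dst - cd)                   # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    # correct for a possible reflection so that det(R) = +1
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    t = cd - R @ cs
    return R, t
```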
In comparison, data acquisition in camera-based reconstruction from multiple viewpoints is fast, but model generation is computationally more complex owing to bundle adjustment and dense pair-wise stereo computation (see figures B and C for views rendered from a 3D model based on 6 image pairs). In addition, it is non-trivial and often time-consuming in the field to ensure that sufficient information has been acquired. We are currently developing tools that will allow us to combine the results of laser-scanner and camera-based 3D reconstruction methods.
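The dense stereo step mentioned above rests on a simple relation for a rectified camera pair: a feature seen with disparity d (pixels) by two cameras with focal length f (pixels) and baseline B (metres) lies at depth Z = f·B / d. The helper below is a hypothetical illustration of this relation, not part of the authors' reconstruction software.

```python
# Depth from stereo disparity for a rectified camera pair: Z = f * B / d.
# Works on scalars or numpy arrays of disparities; disparity must be > 0.

def disparity_to_depth(disparity, focal_px, baseline_m):
    """Return depth in metres for a given disparity in pixels."""
    return focal_px * baseline_m / disparity
```

For example, with f = 1000 px and B = 0.3 m, a 10-pixel disparity corresponds to a depth of 30 m, which also illustrates why depth resolution degrades with distance: at large range a whole pixel of disparity spans many metres of depth.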

Figure 1

Keywords: view reconstruction, methods development, 3D model acquisition, simulation, view rendering

Conference: International Conference on Invertebrate Vision, Fjälkinge, Sweden, 1 Aug - 8 Aug, 2013.

Presentation Type: Oral presentation preferred

Topic: Navigation and orientation

Citation: Stürzl W, Mair E, Hirschmüller H and Zeil J (2019). Mapping the Navigational Information Content of Insect Habitats. Front. Physiol. Conference Abstract: International Conference on Invertebrate Vision. doi: 10.3389/conf.fphys.2013.25.00085

Copyright: The abstracts in this collection have not been subject to any Frontiers peer review or checks, and are not endorsed by Frontiers. They are made available through the Frontiers publishing platform as a service to conference organizers and presenters.

The copyright in the individual abstracts is owned by the author of each abstract or his/her employer unless otherwise stated.

Each abstract, as well as the collection of abstracts, are published under a Creative Commons CC-BY 4.0 (attribution) licence (https://creativecommons.org/licenses/by/4.0/) and may thus be reproduced, translated, adapted and be the subject of derivative works provided the authors and Frontiers are attributed.

For Frontiers’ terms and conditions please see https://www.frontiersin.org/legal/terms-and-conditions.

Received: 26 Feb 2013; Published Online: 09 Dec 2019.

* Correspondence: Dr. Wolfgang Stürzl, German Aerospace Center (DLR), Wessling, Germany, wolfgang.stuerzl@dlr.de