For many animals, vision is a key sense for survival. Vision becomes challenging in fog, however, because scattering intensifies when the particles suspended in the medium approach or exceed the wavelength of light in size. Fog occludes important visual information by attenuating the available light, reducing overall contrast, and limiting visibility in the scene. For some animals, fog is episodic and only temporarily interferes with diurnal activities. For others, such as those endemic to cloud forests, it is a persistent condition that defines their visual world. The situation is even more nuanced in natural bodies of water, where a "colored" fog is ubiquitous but its thickness varies geographically, temporally, and with viewing distance. Different populations of many marine species, including fish, turtles, and scallops (all of which have image-forming visual systems), can be found in both crystal-clear waters and extremely turbid habitats, which pose unique challenges to their vision.
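As a point of reference (the notation here is ours, not part of this call), the attenuation and contrast loss described above are commonly summarized by a simple single-scattering image formation model: the radiance reaching the eye from an object at distance d is approximately I = J·e^(−β·d) + A·(1 − e^(−β·d)), where J is the unattenuated radiance of the object, A is the ambient veiling light, and β is the attenuation coefficient of the medium. Apparent contrast therefore decays roughly as e^(−β·d), and in water β is strongly wavelength dependent, β = β(λ), which is why turbidity behaves like a colored fog.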
With this Research Topic, our goal is to create the most comprehensive body of knowledge regarding the ecology and evolution of animal vision in limited visibility conditions. We aim to investigate questions such as: What is the nature of fog? What are the optical characteristics of light fields and visibility ranges in natural environments such as cloud forests and oceans? Do existing mathematical models of vision accurately reflect the physics of sensing in fog/turbidity? What challenges must animal visual systems overcome to successfully find food, acquire a mate, and avoid predation in the presence of fog or aquatic turbidity? What unique visual adaptations have evolved in response to challenging visibility conditions? What is the genetic and physiological basis for these adaptations, and do they show intra- or inter-specific differences depending on the prevalence of fog/turbidity?
A better understanding of vision in limited visibility can have an impact reaching far beyond evolution and visual ecology: it can inspire the development of biomimetic cameras with unique sensor-microlens combinations, as well as imaging algorithms customized for different visibility states, with immediate applications in self-driving cars and ocean exploration. We therefore invite researchers from all disciplines (including, but not limited to, ecology, evolution, biology, neuroscience, vision, computer science, optics, engineering, and remote sensing) to team up and combine hypotheses, methods, and data in an evolutionary framework. We also invite naturalists and nature photographers to submit data papers contributing imagery that documents light/visibility conditions in cloud forests and underwater habitats, as well as their biodiversity. Themes of interest include, but are not limited to:
• Sensory adaptations for better vision in attenuating media (spectral sensitivity, polarization vision, colored ocular filters, oil droplets, eye designs, etc.)
• Behavioral and/or visual perception experiments in attenuating media
• Visual neuroscience/neurobiology, e.g. dehazing computations in the retina
• Inter- and intra-species comparisons of animals living in episodic versus ever-present fog
• Cone catch and image formation models for vision in attenuating media (a minimal illustrative sketch follows this list)
• Visibility limitations due to visual impairment/deficits of the individual
• Quantitative measurements of the photic environments of cloud forests and natural water bodies (e.g., intensity, spectrum, polarization, 3D distribution of light, optical thickness of fog/clouds etc.)
• Qualitative photographic collections featuring endemic species and documenting the life and light environments of cloud forests and turbid underwater habitats
• 3D computer simulations of complex natural habitats under fog (terrestrial or aquatic)
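To make the "cone catch and image formation models" theme above concrete, here is a minimal sketch (in Python) of a receptor quantum catch computed through an attenuating medium. All spectra, parameter values, and names are hypothetical placeholders chosen for illustration, not measurements or methods associated with this Research Topic. The catch is approximated by summing reflectance × illumination × receptor sensitivity × transmission over wavelength, with transmission falling as e^(−c(λ)·d):

```python
import numpy as np

# Hypothetical sketch: quantum catch of one cone class viewing a target
# through an attenuating medium. All spectra below are placeholders.
wavelengths = np.arange(400.0, 701.0, 10.0)                        # nm
step = 10.0                                                        # nm spacing
illuminant = np.ones_like(wavelengths)                             # flat light field
reflectance = np.linspace(0.2, 0.8, wavelengths.size)              # target reflectance
sensitivity = np.exp(-0.5 * ((wavelengths - 550.0) / 40.0) ** 2)   # cone sensitivity
beam_atten = 0.02 + 0.04 * (700.0 - wavelengths) / 300.0           # c(lambda), 1/m

def cone_catch(distance_m: float) -> float:
    """Q(d) = sum over lambda of R * I * S * exp(-c * d), times the nm step."""
    transmission = np.exp(-beam_atten * distance_m)
    return float(np.sum(reflectance * illuminant * sensitivity * transmission) * step)

# Catch shrinks with viewing distance, and unevenly across wavelengths,
# which is what makes turbid water act like a colored fog.
for d in (0.0, 1.0, 5.0, 20.0):
    print(f"distance {d:5.1f} m -> relative catch {cone_catch(d) / cone_catch(0.0):.3f}")
```

Normalizing by the zero-distance catch shows how quickly the receptor signal degrades with each additional metre of fog or water; because c(λ) varies with wavelength, the chromatic signal shifts as well as dims.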
If you are unsure about the suitability of your manuscript to this Research Topic, please do not hesitate to contact the Editors.
Topic Editor Michael Webster received a research contract from Johnson and Johnson Optical Division. Derya Akkaynak holds several patents but none related to visibility enhancement (‘Sea-thru’, whose patent is pending, is a color enhancement algorithm). The other Topic Editors declare no potential conflicts of interest related to this Research Topic.