Cognitive perception (CP) is the study of the principles that coordinate mid- and high-level perception on the one hand with knowledge-based control and general cognition on the other.
Real task environments are rich in structure and detail. Most of this complexity is irrelevant to an intelligent agent: an autonomous system must be able to discriminate what matters and ignore the rest. To switch successfully between various tasks, goals, and domains, the perception of a general learning system must be "task-steerable", i.e., able to develop so as to attend to task-relevant environmental structures in light of the actions of the situated agent.
The cognitive and perceptual mechanisms of many mammals achieve such task-directed perception. Most current perception approaches in artificial intelligence (AI), by contrast, require too much guidance during learning and are inflexible after training. Moreover, the factorizations they produce carry too much general detail and too little task-relevant information. This places a high computational burden first on the perception system, which must capture the relatively small amount of relevant detail, and second on the intelligent system, which must instead cope with an over-abundance of irrelevant detail whose factorization is unlikely to be tuned to the needs of cognition in carrying out its tasks.
Research in CP aims to uncover the principles that coordinate perception and higher-level cognition. A long history of research on each has produced advances, but an effective unification of the two is still called for. Such a unified CP system would share an attention mechanism, learning, and other functions across high- and low-level cognition and perception. This would address both what to learn in complex environments and how to represent perceptual data so as to provide the factorization relevant to cognition at any point in time.
We invite papers from any field that address aspects of CP, from neuroscience to AI, artificial general intelligence (AGI), and data science. Areas of interest include the attentional activation of perception, working memory representations (whether symbolic or sub-symbolic), long-term memory mechanisms and their interaction with working memory, and the operation and implementation of cognitive functions during lifelong learning.
Relevant topics include, but are not limited to:
- The role of top-down control in perception
- Unified top-down / bottom-up real-time perception
- Bootstrapping perceptual learning
- Perception-based self-supervised learning
- Coordination of perception and cognition
- Statistical versus reasoning-based perceptual processing
- Neuro-symbolic approaches to cognitive perception
- Developmental cognitive perception
- Generality and autonomy in perception-based learning
- Active perception
- Dynamic resource allocation
- Percept-concept association
- Task-driven attention allocation
- Compositionality in perception
- Unsupervised learning of task-relevant representations
- Attentional mechanisms in perception
- Embodied perception
- Cognitive vs. reactive perception
- The role of emotions in perception