Abstract
The deep layers of the superior colliculus (SC) integrate information from multiple senses to initiate orienting movements in vertebrate animals. Anastasio et al. [1] have proposed a probabilistic model of the SC based on an interpretation of the neuroscientific data. By incorporating this SC model, in the form of an artificial neural network, as the decision mechanism for a system with two senses, hearing and vision, we have constructed and tested a Self-Aiming Camera (SAC). SAC senses its environment and directs its lens toward the best "target" available at any moment. Experiments compared SAC running the SC model against several ad hoc algorithms for combining the multisensory data. In general, the SC model is superior at handling low-amplitude signals and at least equal to any ad hoc model over the full range of unimodal and bimodal targets.
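The probabilistic SC model cited above is Bayesian in character: the deep SC is treated as computing the posterior probability that a target is present given its visual and auditory inputs. The sketch below illustrates that idea only in minimal form; it is not the paper's implementation. The Poisson likelihoods, the rate parameters, and the function names are all illustrative assumptions.

```python
import math

def poisson_pmf(k, lam):
    """Probability of observing k events under a Poisson distribution with mean lam."""
    return lam ** k * math.exp(-lam) / math.factorial(k)

def posterior_target(v_count, a_count, prior=0.5,
                     lam_on=(6.0, 6.0), lam_off=(1.0, 1.0)):
    """Posterior probability of a target given visual and auditory spike counts.

    Assumes conditionally independent Poisson inputs: lam_on are the mean
    counts when a target is present, lam_off when it is absent. All values
    here are illustrative, not taken from the paper.
    """
    like_on = poisson_pmf(v_count, lam_on[0]) * poisson_pmf(a_count, lam_on[1])
    like_off = poisson_pmf(v_count, lam_off[0]) * poisson_pmf(a_count, lam_off[1])
    evidence = prior * like_on + (1 - prior) * like_off
    return prior * like_on / evidence

# A weak cue in one modality alone yields a modest posterior; adding a
# second modality raises it sharply -- the multisensory enhancement that
# makes the Bayesian model effective on low-amplitude signals.
p_unimodal = posterior_target(3, 1)   # mostly visual evidence
p_bimodal = posterior_target(3, 3)    # both modalities active
```

In a SAC-like system, a posterior of this kind would be computed per spatial location and the lens directed toward the maximum; the abstract's neural network stands in for this computation.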
Original language | English (US) |
---|---|
Title of host publication | Proceedings of the International Joint Conference on Neural Networks |
Pages | 3201-3206 |
Number of pages | 6 |
Volume | 4 |
State | Published - 2003 |
Event | International Joint Conference on Neural Networks 2003 - Portland, OR, United States. Duration: Jul 20 2003 → Jul 24 2003 |
Other
Other | International Joint Conference on Neural Networks 2003 |
---|---|
Country/Territory | United States |
City | Portland, OR |
Period | 7/20/03 → 7/24/03 |
ASJC Scopus subject areas
- Software
- Artificial Intelligence