Monitoring 2D environments with eye-tracking-based agents
What kind of environments are our visual search strategies adapted to? Can they be transplanted to autonomous vehicles?
There is natural variation in how people search for salient features in their field of vision. The Virtual Eye project at OsloMet and Simula is developing computational models of eye-gaze dynamics that capture this variation. Different search strategies are thought to be optimal in different types of environments (Schick et al. 2008). This thesis project will test this hypothesis systematically by developing simulations of environments monitored by artificial search agents.

As a prototypical search task, the project will consider source localization (Hwang et al. 2019): for example, finding a food source by first locating a patch of odour, or finding an underwater leak by locating a patch of methane. To implement this, gradient-following will be added to the current parameterized eye-gaze models. Localization efficiency will then be evaluated in environments that vary in patch size, patch complexity and patch sparsity. Patches will be generated from simulated turbulent fluid motion using the software package GadenTools, and the effect of model parameters will be evaluated across the different environment types.
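To give a flavour of the gradient-following component, the sketch below shows a minimal search agent in Python. Everything here is illustrative: the Gaussian `concentration` field is a stand-in for a GadenTools-generated turbulent plume, and the `step` and `noise` parameters are placeholders for the actual eye-gaze model parameters, which are not specified in this description.

```python
import numpy as np

def concentration(pos, source=np.array([0.0, 0.0]), width=1.0):
    """Synthetic odour patch: a Gaussian centred on the source.
    (Illustrative stand-in for a turbulent plume snapshot.)"""
    return np.exp(-np.sum((np.asarray(pos) - source) ** 2) / (2 * width ** 2))

def gradient_follow(start, step=0.1, eps=1e-3, noise=0.0,
                    max_steps=1000, threshold=0.95, rng=None):
    """Gradient-following search: estimate the local concentration
    gradient by central finite differences and step uphill until the
    agent sits inside the patch (concentration above `threshold`).
    `noise` adds random jitter, loosely mimicking gaze variability.
    Returns the final position and the number of steps taken, which
    serves as a simple localization-efficiency measure."""
    rng = rng or np.random.default_rng(0)
    pos = np.asarray(start, dtype=float)
    for n in range(max_steps):
        if concentration(pos) >= threshold:
            return pos, n  # source localized
        # central-difference gradient estimate from four probe samples
        grad = np.array([
            concentration(pos + [eps, 0]) - concentration(pos - [eps, 0]),
            concentration(pos + [0, eps]) - concentration(pos - [0, eps]),
        ]) / (2 * eps)
        direction = grad / (np.linalg.norm(grad) + 1e-12)
        pos = pos + step * direction + noise * rng.standard_normal(2)
    return pos, max_steps

final, steps = gradient_follow(start=[2.0, -1.5])
```

In the thesis, the hand-coded uphill step would be replaced by the parameterized eye-gaze model, and `steps` (or time to localization) would be compared across environments of varying patch size, complexity and sparsity.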
Goal
The goal of the thesis is to implement an autonomous vehicle simulation based on visual search models, and to identify the main relationships between optimal search parameters and environment properties.
Learning outcome
The student will gain experience with movement planning for autonomous vehicles, with applications in environmental monitoring. In addition, the student will gain experience with high-performance computing, running large simulation experiments, and drawing scientific conclusions.
Qualifications
The student should be comfortable with coding in Python and with differential equations. Experience with git/GitHub and enrollment in a cybernetics or robotics programme are advantageous.
Supervisors
- Alexander Szorkovszky
- Pedro Lind
Collaboration partners
- OsloMet AI Lab
References
- Hwang, J., Bose, N., & Fan, S. (2019). AUV adaptive sampling methods: A review. Applied Sciences, 9(15), 3145.
- Schick, R. S., Loarie, S. R., Colchero, F., Best, B. D., Boustany, A., Conde, D. A., ... & Clark, J. S. (2008). Understanding movement data and movement processes: current and emerging directions. Ecology Letters, 11(12), 1338-1350.