Influence of eye-movements on multisensory stimulus localization: experiments, models and robotics applications

Abstract: To make sense of their environment, both humans and robots need to construct a consistent percept from many sources of information (including visual and auditory stimulation). Multimodal merging therefore plays a key role in human perception, for instance by lowering reaction times and detection thresholds. Psychophysics experiments have shown that humans can fuse information in a Bayes-optimal way (Ernst & Banks, 2002), weighting each modality by its precision (i.e., the inverse of its perceived variance). These weights are usually estimated a posteriori from experimental data, but the mechanisms by which agents may estimate such precisions online remain poorly studied. Candidate mechanisms may stem from sensorimotor accounts of perception and the predictive coding framework, in which actions (e.g., saccades) are used to simultaneously estimate stimulus location and sensory precision (Friston et al., 2011). In the context of the AMPLIFIER (Active Multisensory Perception and LearnIng For InteractivE Robots) project (2018-2022), we study the mutual influence of multisensory fusion and active perception. The project combines three complementary components. First, psychophysics experiments help confirm and refine hypotheses by manipulating stimuli and task constraints (e.g., audio-visual discrepancy, stimulus presentation time, number of fixations or saccades during presentation) and estimating their effect on saccadic eye movements, as well as the effect of eye movements on target localization. Second, neurocomputational models based on the dynamic neural field framework provide distributed representations of stimuli, making it possible to replicate experimental data and to generate predictions. Finally, these models will be coupled with active decision-making and developmental learning of sensorimotor contingencies, then embedded in social robotic platforms, to improve human-robot interaction through more natural gaze behavior and more appropriate reactions in complex environments.
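The precision-weighted fusion rule cited above (Ernst & Banks, 2002) can be sketched in a few lines. This is a minimal illustration, not code from the project: the function name `fuse_estimates` and the numeric cue values are illustrative assumptions.

```python
import numpy as np

def fuse_estimates(means, variances):
    """Precision-weighted fusion of independent Gaussian cues.

    Each modality is weighted by its precision (inverse variance),
    which yields the Bayes-optimal combined estimate for independent
    Gaussian noise (Ernst & Banks, 2002).
    """
    means = np.asarray(means, dtype=float)
    precisions = 1.0 / np.asarray(variances, dtype=float)
    weights = precisions / precisions.sum()  # normalized precision weights
    fused_mean = float(np.dot(weights, means))
    fused_variance = float(1.0 / precisions.sum())  # fused precision is the sum
    return fused_mean, fused_variance

# Illustrative example: a reliable visual cue at 0 deg (variance 1.0)
# and a noisier auditory cue at 10 deg (variance 4.0).
mu, var = fuse_estimates([0.0, 10.0], [1.0, 4.0])  # → (2.0, 0.8)
```

Note that the fused variance is always smaller than that of the most reliable single cue, which is one way the abstract's claim about lowered detection thresholds can be formalized.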
Contributor: Mathieu Lefort
Submitted on: Friday, October 12, 2018 - 3:42:30 PM
Last modification on: Thursday, January 10, 2019 - 4:34:26 PM
Document(s) archived on: Sunday, January 13, 2019 - 2:44:28 PM




  • HAL Id: hal-01894621, version 1


Mathieu Lefort, Jean-Charles Quinton, Simon Forest, Adrien Techer, Alan Chauvin, et al. Influence of eye-movements on multisensory stimulus localization: experiments, models and robotics applications. Workshop on Models and Analysis of Eye Movements, Jun 2018, Grenoble, France. 〈hal-01894621v1〉


