The Vision and Haptics Lab is based in the School of Psychology and Clinical Language Sciences at the University of Reading. Work in the lab investigates human multi-sensory perception using techniques that straddle the boundary between Psychology and Engineering.
Current work is focused on three areas:
(1) Understanding how to optimise bi-manual visual-haptic virtual reality systems; in particular, the impact of transforms in the mapping between vision and touch on operator performance. This is part of the lab's work with the RAIN Hub.
(2) Determining how information from multiple sensory modalities is integrated for perception and visuomotor control (particularly vision and haptics).
(3) Elucidating the nature of the internal models that humans construct from multi-sensory data. For example, the extent to which the brain models the physical laws governing the environment.
Techniques we use to investigate these areas include: behavioural experiments and psychophysics, stereoscopic presentation, virtual reality, haptic robotics, and computational modelling.
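As a simple illustration of the computational modelling approach applied to area (2), multi-sensory integration is often modelled as reliability-weighted averaging of cues, where each cue's weight is proportional to its inverse variance. The sketch below is illustrative only (the function name, parameters, and numbers are hypothetical, not the lab's actual code or data):

```python
import math

def integrate_cues(mu_v, sigma_v, mu_h, sigma_h):
    """Illustrative reliability-weighted combination of a visual and a
    haptic estimate, each modelled as a Gaussian (mean, s.d.)."""
    r_v = 1.0 / sigma_v ** 2          # visual reliability (inverse variance)
    r_h = 1.0 / sigma_h ** 2          # haptic reliability
    w_v = r_v / (r_v + r_h)           # weight given to vision
    mu = w_v * mu_v + (1.0 - w_v) * mu_h
    sigma = math.sqrt(1.0 / (r_v + r_h))  # combined estimate is more precise
    return mu, sigma

# Hypothetical example: a more reliable visual size estimate (5.2 cm, s.d. 0.4)
# combined with a noisier haptic one (4.8 cm, s.d. 0.8).
mu, sigma = integrate_cues(5.2, 0.4, 4.8, 0.8)
```

On this model the combined estimate lies between the two single-cue estimates, pulled towards the more reliable cue, and its standard deviation is smaller than either cue's alone.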