Vision and Haptics Lab

 Dr Peter Scarfe

The Vision and Haptics Lab is based in the School of Psychology and Clinical Language Sciences at the University of Reading. Work in the lab investigates human multi-sensory perception using techniques that straddle the boundary between Psychology and Engineering.


Current work is focused in three areas:


(1) Understanding how sensory data is used to construct internal models of the physical laws governing the environment and how these models shape our perception of the world.


(2) Determining how information from multiple sensory modalities is integrated for perception and visuomotor control (particularly vision and haptics).


(3) Elucidating the sensory information and internal models people use when actively moving and navigating within their environment.


Techniques we use to investigate these areas include: behavioural experiments and psychophysics, stereoscopic presentation, 3D immersive virtual reality, haptic robotics, machine learning and Bayesian modelling.





Visit to Shadow Robotics


The lab, together with Prof. William Harwin's group in the School of Biological Sciences, was able to visit Shadow Robotics last year to see their haptic robotics equipment.



RAIN Grant Accepted


The lab, together with Prof. William Harwin's group in the School of Biological Sciences, has had a grant accepted to become part of the Robotics and AI in Nuclear (RAIN) consortium. The grant will fund a post-doctoral position in the lab to investigate ways to optimise bi-manual, multi-finger haptic robotic and VR telepresence systems. On the grant we will be working closely with Generic Robotics to integrate our hardware with the TOIA software system for rendering haptics in Unreal Engine.




Latest Paper from the Lab


Hornsey, R., Hibbard, P. B. and Scarfe, P. (2020). Size and shape constancy in consumer virtual reality. Behavior Research Methods, 1-12.



Work in the lab would not be possible without generous funding from