Vision and Haptics Lab

Dr Peter Scarfe

The Vision and Haptics Lab is based in the School of Psychology and Clinical Language Sciences at the University of Reading. Work in the lab investigates human multi-sensory perception using techniques that straddle the boundary between Psychology and Engineering.


Current work is focused in three areas:


(1) Understanding how sensory data is used to construct internal models of the physical laws governing the environment and how these models shape our perception of the world.


(2) Determining how information from multiple sensory modalities is integrated for perception and visuomotor control (particularly vision and haptics).


(3) Elucidating the sensory information and internal models people use when actively moving and navigating within their environment.


Techniques we use to investigate these areas include: behavioural experiments and psychophysics, stereoscopic presentation, 3D immersive virtual reality, haptic robotics, machine learning and Bayesian modelling.
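As a minimal sketch of the Bayesian modelling approach applied to two-cue integration (e.g. vision and haptics), the textbook maximum-likelihood model weights each cue by its reliability (inverse variance). The function name and example values below are illustrative assumptions, not code from the lab:

```python
import math

def combine_cues(s_v, sigma_v, s_h, sigma_h):
    """Reliability-weighted (maximum-likelihood) combination of a visual
    and a haptic estimate of the same property, e.g. object size.

    s_v, s_h         : single-cue estimates
    sigma_v, sigma_h : single-cue standard deviations
    """
    r_v = 1.0 / sigma_v ** 2          # visual reliability
    r_h = 1.0 / sigma_h ** 2          # haptic reliability
    w_v = r_v / (r_v + r_h)           # weight on the visual cue
    s_hat = w_v * s_v + (1.0 - w_v) * s_h
    sigma_hat = math.sqrt(1.0 / (r_v + r_h))  # combined SD is never larger
    return s_hat, sigma_hat                   # than either single-cue SD

# Equally reliable cues: the combined estimate is their average,
# and its variance is halved.
s_hat, sigma_hat = combine_cues(10.0, 1.0, 12.0, 1.0)
```

With equally reliable cues the model predicts a simple average (here 11.0) and a combined standard deviation of 1/sqrt(2); unequal reliabilities shift the estimate toward the more reliable cue.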





First PhD Student graduates: Congratulations to Dr. Mark Adams


A very proud moment as my first PhD student passes his viva. Mark was co-supervised by Prof. Andrew Glennerster and Prof. William Harwin. Mark's research focused on how vision and touch are integrated when localising objects in the environment, combining immersive virtual reality with spatially co-aligned haptic robotics. Mark is now a post-doc in Dundee.



Lab@VSS 2018


Research from the lab was presented in four posters at this year's VSS conference. Click on the links below for copies of the posters.


(1) Saturday Morning, poster 23.369: Multisensory Detection: Using Vision and Haptics to detect hidden objects. Julie Skevik and Peter Scarfe. poster


(2) Sunday Morning, poster 33.450: Perception of Object Movement in Virtual Reality. Rowan T Hughes, Peter Scarfe, Paul B. Hibbard and Loes C. J. van Dam. poster


(3) Sunday Morning, poster 33.454: Detecting 3D location change in the presence of grouping cues. Ellis L. Gootjes-Dreesbach, Peter Scarfe and Andrew Glennerster. poster


(4) Monday Morning, poster 43.360: Experimentally disambiguating models of sensory cue combination. Peter Scarfe and Andrew Glennerster. poster




Oculus Grant Accepted


Our lab, together with Dr. Loes van Dam and Prof. Paul Hibbard from the University of Essex, has been awarded funding by Oculus Research for a project investigating how information is integrated across the senses in virtual reality. It promises to be a fantastic project at the crossover between applied and basic research.









Latest Paper from the lab


Hornsey, R. L., Hibbard, P. B. and Scarfe, P. (2016). Binocular depth judgements on smoothly curved surfaces. PLOS One, 11 (11), 1-18. pdf



Work in the lab would not be possible without generous funding from