Dr Peter Scarfe
The Vision and Haptics Lab is based in the School of Psychology and Clinical Language Sciences at the University of Reading. Work in the lab investigates human multi-sensory perception using techniques that straddle the boundary between Psychology and Engineering.
Current work focuses on three areas:
(1) Understanding how sensory data is used to construct internal models of the physical laws governing the environment and how these models shape our perception of the world.
(2) Determining how information from multiple sensory modalities is integrated for perception and visuomotor control (particularly vision and haptics).
(3) Elucidating the sensory information and internal models people use when actively moving and navigating within their environment.
Techniques we use to investigate these areas include: behavioural experiments and psychophysics, stereoscopic presentation, 3D immersive virtual reality, haptic robotics, machine learning and Bayesian modelling.
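As an illustration of the kind of modelling involved, the standard account of cue integration combines estimates from two modalities (e.g. vision and haptics) by weighting each in proportion to its reliability (inverse variance). A minimal sketch, with hypothetical function and parameter names:

```python
import math

def combine_cues(mu_v, sigma_v, mu_h, sigma_h):
    """Reliability-weighted (maximum-likelihood) combination of two cues.

    Each cue's reliability is its inverse variance; the combined estimate
    is the reliability-weighted mean, and the combined standard deviation
    is smaller than that of either cue alone.
    """
    r_v = 1.0 / sigma_v ** 2            # reliability of the visual cue
    r_h = 1.0 / sigma_h ** 2            # reliability of the haptic cue
    w_v = r_v / (r_v + r_h)             # weight given to vision
    mu_c = w_v * mu_v + (1.0 - w_v) * mu_h
    sigma_c = math.sqrt(1.0 / (r_v + r_h))
    return mu_c, sigma_c
```

For example, a visual estimate of 10 (sigma 1) combined with a haptic estimate of 12 (sigma 2) is pulled towards the more reliable visual cue, and the combined estimate is more precise than either cue on its own.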
Royal Society Summer Exhibition
Between July 4th and 11th I took part in the Royal Society Summer Exhibition, running a virtual reality comet experience at "The Comet Revealed: Rosetta and Philae at Comet 67P". This allowed members of the public to see Comet 67P, the Rosetta spacecraft and the Philae lander in virtual reality. The VR experience was extremely popular throughout the week.
Royal Society Grant Accepted
A matter of days after our British Academy grant success, the Royal Society informed us that our grant to study "Sensitivity to object depth in three-dimensional visual haptic environments" has been accepted. The project, a collaboration with Prof. Paul Hibbard at the University of Essex, will look at how depth judgements are made from vision and touch in highly realistic 3D scenes built from laser scans of real-world objects.
British Academy Grant Accepted
Our British Academy grant on "The role of material appearance in multisensory perception" has been accepted. The project will look at how cues are combined from vision and touch for judging (a) material categories such as "wood", "metal" and "plastic", and (b) physical properties such as surface orientation. We will model this process in a Bayesian causal inference framework.
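To give a flavour of the causal inference framework, the standard Gaussian model (in the style of Körding and colleagues, not the project's actual implementation) computes the posterior probability that visual and haptic measurements arose from a single common cause rather than two independent ones. A minimal sketch, with hypothetical names and a zero-mean Gaussian prior over the stimulus:

```python
import math

def gauss(x, mu, var):
    """Gaussian density with mean mu and variance var."""
    return math.exp(-(x - mu) ** 2 / (2.0 * var)) / math.sqrt(2.0 * math.pi * var)

def prob_common_cause(x_v, x_h, sigma_v, sigma_h, sigma_p, p_common):
    """Posterior probability that visual measurement x_v and haptic
    measurement x_h share one cause, given cue noise (sigma_v, sigma_h),
    a zero-mean Gaussian prior with width sigma_p, and a prior
    probability p_common of a common cause.
    """
    var_v, var_h, var_p = sigma_v ** 2, sigma_h ** 2, sigma_p ** 2
    # Marginal likelihood under one common cause (stimulus integrated out)
    var_c = var_v * var_h + var_v * var_p + var_h * var_p
    like_one = math.exp(
        -0.5 * ((x_v - x_h) ** 2 * var_p + x_v ** 2 * var_h + x_h ** 2 * var_v) / var_c
    ) / (2.0 * math.pi * math.sqrt(var_c))
    # Marginal likelihood under two independent causes
    like_two = gauss(x_v, 0.0, var_v + var_p) * gauss(x_h, 0.0, var_h + var_p)
    # Bayes' rule over the two causal structures
    return like_one * p_common / (like_one * p_common + like_two * (1.0 - p_common))
```

The key behaviour is that the posterior probability of a common cause falls as the discrepancy between the two measurements grows, so the model integrates similar cues and segregates conflicting ones.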
Latest Paper from the Lab
Hornsey, R. L., Hibbard, P. B. and Scarfe, P. (2015). Ordinal judgements of depth in monocularly- and binocularly-viewed photographs of complex natural scenes. Proceedings of the International Conference of 3D Imaging. pdf