Vision and Haptics Lab

 Dr Peter Scarfe

The Vision and Haptics Lab is based in the School of Psychology and Clinical Language Sciences at the University of Reading. Work in the lab investigates human multi-sensory perception using techniques that straddle the boundary between Psychology and Engineering.

 

Current work is focused on three areas:

 

(1) Understanding how to optimise bi-manual visual-haptic telepresence systems; in particular, the impact of transforms in the mapping between vision and touch on operator performance. This is part of the lab's work with the RAIN Hub.

 

(2) Determining how information from multiple sensory modalities is integrated for perception and visuomotor control (particularly vision and haptics).

 

(3) Elucidating the nature of the internal models that humans construct from multi-sensory data. For example, the extent to which the brain models the physical laws governing the environment.

 

Techniques we use to investigate these areas include: behavioural experiments and psychophysics, stereoscopic presentation, virtual reality, haptic robotics, and computational modelling.
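For readers unfamiliar with cue combination, the standard maximum-likelihood (MLE) model provides a useful sketch of how integration across modalities is typically formalised. This is the textbook model, shown here for illustration rather than as the lab's own formulation: visual and haptic estimates are averaged with weights set by their relative reliabilities.

```latex
\hat{S}_{VH} = w_V \hat{S}_V + w_H \hat{S}_H,
\qquad
w_V = \frac{1/\sigma_V^2}{1/\sigma_V^2 + 1/\sigma_H^2},
\qquad
w_H = 1 - w_V
```

Under this model the combined estimate has variance \(\sigma_{VH}^2 = \sigma_V^2 \sigma_H^2 / (\sigma_V^2 + \sigma_H^2)\), which is never larger than the variance of either single cue, so integrating vision and haptics can only improve precision.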


NEWS

 

OSA Incubator on Visual Perception in AR/VR

 

Peter will be speaking at this meeting on Content-Driven AR/VR Applications. The meeting was to be held in Washington, USA, but will now be virtual. It runs from the 22nd to the 25th of September. See here for more details.


Human Robot Interaction (HRI) Workshop (RAIN)

 

Peter and William Harwin will be running the Haptics session of the HRI workshop as part of the RAIN Hub. You can sign up to attend the workshop for free here. The workshop runs from the 28th of September through the 1st of October. Sessions include: VR/AR, Communication and Perception, Teleoperation, Human Factors, Trustworthiness, Haptics, and Shared Control.


Latest progress on RAIN Grant

 

Since getting back into the lab, we have been making fantastic progress on our grant with the RAIN Hub. The picture below shows the four custom-made haptic force-feedback devices mounted on our new rigging, along with four Vicon Vero 2 cameras. The next step is to calibrate the setup to achieve spatially co-aligned bi-manual haptics (two points of contact per hand) with virtual reality provided by an Oculus headset.

[Image: the four custom-made haptic force-feedback devices on the new rigging, with four Vicon Vero 2 cameras]

Visit to Shadow Robotics

 

Last year, prior to the COVID lockdown, the lab, together with Prof. William Harwin's group in the School of Biological Sciences, was able to visit Shadow Robotics to see their awesome haptic robotics equipment.


Latest Paper from the lab

 

Scarfe, P. (submitted). Experimentally disambiguating models of sensory cue combination. preprint

 

Hornsey, R., Hibbard, P. B. and Scarfe, P. (2020). Size and shape constancy in consumer virtual reality. Behavior Research Methods, 1-12. pdf


Work in the lab would not be possible without generous funding from